ACTS
Experiment-independent tracking
ActsPlugins::OnnxRuntimeBase Class Reference

#include </home/runner/work/acts/acts/Plugins/Onnx/include/ActsPlugins/Onnx/OnnxRuntimeBase.hpp>

Inheritance diagram for ActsPlugins::OnnxRuntimeBase:

Public Member Functions

 OnnxRuntimeBase ()=default
 Default constructor.
 OnnxRuntimeBase (Ort::Env &env, const char *modelPath)
 Parametrized constructor.
 ~OnnxRuntimeBase ()=default
 Default destructor.
std::vector< std::vector< float > > runONNXInference (NetworkBatchInput &inputTensorValues) const
 Run the ONNX inference function for a batch of inputs.
std::vector< float > runONNXInference (std::vector< float > &inputTensorValues) const
 Run the ONNX inference function.
std::vector< std::vector< std::vector< float > > > runONNXInferenceMultiOutput (NetworkBatchInput &inputTensorValues) const
 Run the multi-output ONNX inference function for a batch of inputs.

Constructor & Destructor Documentation

◆ OnnxRuntimeBase() [1/2]

ActsPlugins::OnnxRuntimeBase::OnnxRuntimeBase() = default

Default constructor.

◆ OnnxRuntimeBase() [2/2]

ActsPlugins::OnnxRuntimeBase::OnnxRuntimeBase(Ort::Env &env, const char *modelPath)

Parametrized constructor.

Parameters
  env        the ONNX runtime environment
  modelPath  the path to the ML model in *.onnx format
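A minimal construction sketch; the environment name and model path below are placeholders, not values prescribed by the library:

```cpp
#include <onnxruntime_cxx_api.h>

#include "ActsPlugins/Onnx/OnnxRuntimeBase.hpp"

int main() {
  // The Ort::Env owns the ONNX Runtime logging and threading state and
  // must outlive any model constructed from it.
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "acts-onnx");

  // Load a trained network from a *.onnx file (placeholder path).
  ActsPlugins::OnnxRuntimeBase model(env, "path/to/model.onnx");
}
```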

◆ ~OnnxRuntimeBase()

ActsPlugins::OnnxRuntimeBase::~OnnxRuntimeBase() = default

Default destructor.

Member Function Documentation

◆ runONNXInference() [1/2]

std::vector< std::vector< float > > ActsPlugins::OnnxRuntimeBase::runONNXInference ( NetworkBatchInput & inputTensorValues) const

Run the ONNX inference function for a batch of inputs.

Parameters
  inputTensorValues  Vector of the input feature values of all the inputs used for prediction
Returns
The vector of output (predicted) values
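A usage sketch for the batch overload, assuming `NetworkBatchInput` is a row-major Eigen float array with one row per sample (as declared in the Onnx plugin header) and that `model` is an `OnnxRuntimeBase` constructed as above; the feature values and shapes are illustrative:

```cpp
// One row per sample, one column per input feature (illustrative 2x3 batch).
NetworkBatchInput batch(2, 3);
batch << 0.1f, 0.2f, 0.3f,
         0.4f, 0.5f, 0.6f;

// Returns one output vector per batch row.
std::vector<std::vector<float>> predictions = model.runONNXInference(batch);
```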

◆ runONNXInference() [2/2]

std::vector< float > ActsPlugins::OnnxRuntimeBase::runONNXInference ( std::vector< float > & inputTensorValues) const

Run the ONNX inference function.

Parameters
  inputTensorValues  The input feature values used for prediction
Returns
The output (predicted) values
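A sketch for the single-sample overload, again assuming a `model` constructed as above; note the input is taken by non-const reference, so it must be an lvalue (the feature values are illustrative):

```cpp
// Feature vector for a single sample.
std::vector<float> features = {0.1f, 0.2f, 0.3f};

// Returns the predicted values for this one sample.
std::vector<float> prediction = model.runONNXInference(features);
```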

◆ runONNXInferenceMultiOutput()

std::vector< std::vector< std::vector< float > > > ActsPlugins::OnnxRuntimeBase::runONNXInferenceMultiOutput ( NetworkBatchInput & inputTensorValues) const

Run the multi-output ONNX inference function for a batch of inputs.

Parameters
  inputTensorValues  Vector of the input feature values of all the inputs used for prediction
Returns
The vector of output (predicted) values, one for each output
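For networks with several output nodes, a sketch assuming a `batch` and `model` set up as in the examples above, and assuming (based on the return description) that the outer index runs over output nodes:

```cpp
// One entry per output node; each entry holds one vector per batch row.
std::vector<std::vector<std::vector<float>>> outputs =
    model.runONNXInferenceMultiOutput(batch);

// outputs[iOutput][iSample] would then be the prediction of output node
// iOutput for batch row iSample (indexing order assumed, not verified).
```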