iVS3D v2.0.0
NN::OrtNeuralNet Class Reference

A class that implements the NeuralNet interface using ONNX Runtime.

#include <OrtNeuralNet.h>

Inheritance diagram for NN::OrtNeuralNet:
NN::NeuralNet

Public Member Functions

 OrtNeuralNet (const std::string &modelPath, bool useCuda=false, int gpuId=0)
 Construct a new OrtNeuralNet object.
 
tl::expected< Tensor, NeuralError > infer (const Tensor &input) override
 Perform inference on the given input Tensor using the ONNX model.
 
Shape inputShape () const override
 Get the input shape of the neural network.
 
Shape outputShape () const override
 Get the output shape of the neural network.
 
int gpuId () const override
 Get the GPU ID used by the neural network if it is configured to use GPU.
 
- Public Member Functions inherited from NN::NeuralNet
tl::expected< Tensor, NeuralError > operator() (const Tensor &input)
 Call the infer method with the given input tensor.
 

Detailed Description

A class that implements the NeuralNet interface using ONNX Runtime.

This class is responsible for loading an ONNX model, performing inference, and converting between Tensor and ONNX Runtime's Ort::Value. It supports both CPU and GPU execution, depending on the model and the environment setup.

Note
DO NOT use this class directly in your code. Instead, use the NeuralNetFactory and NeuralNet interface to interact with neural networks.

Usage:

// Create an OrtNeuralNet instance
auto neuralNet = NeuralNetFactory::create("model.onnx");
if (!neuralNet) {
    std::cerr << "Failed to create neural network: " << neuralNet.error() << std::endl;
    return -1;
}

// Create a Tensor with the correct shape and data type
Shape inputShape = neuralNet->inputShape(); // assume it returns {1, 3, 224, 224} for a single image input
cv::Mat inputMat(224, 224, CV_32FC3, cv::Scalar(0.0f)); // example input
Tensor inputTensor = Tensor::fromCvMat(inputMat).value(); // convert cv::Mat to Tensor

// Perform inference
auto output = neuralNet->infer(inputTensor);
if (!output) {
    std::cerr << "Inference failed: " << output.error() << std::endl;
    return -1;
}
Tensor outputTensor = output.value();

Constructor & Destructor Documentation

◆ OrtNeuralNet()

NN::OrtNeuralNet::OrtNeuralNet ( const std::string & modelPath,
                                 bool useCuda = false,
                                 int gpuId = 0 )

Construct a new OrtNeuralNet object.

Parameters
    modelPath  The path to the ONNX model file.
    useCuda    Whether to use CUDA for GPU execution. Default is false (CPU).
    gpuId      The ID of the GPU to use if CUDA is enabled. Default is 0.
Note
DO NOT use this constructor directly in your code. Instead, use the NeuralNetFactory to create an instance of this class.
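
For illustration, a minimal sketch of obtaining an instance through the factory instead (assuming the factory exposes matching useGpu and gpuId parameters that it forwards to this constructor):

// Request CUDA execution on GPU 1; creation errors are reported via tl::expected
auto net = NeuralNetFactory::create("model.onnx", /*useGpu=*/true, /*gpuId=*/1);
if (!net) {
    // net.error() holds a NeuralError describing why the model could not be loaded
}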

Member Function Documentation

◆ gpuId()

int NN::OrtNeuralNet::gpuId ( ) const
override virtual

Get the GPU ID used by the neural network if it is configured to use GPU.

Returns
int The GPU ID, or -1 if the neural network does not use GPU.

Implements NN::NeuralNet.
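
For illustration, a hedged sketch of interpreting the return value (net is assumed to be a valid instance obtained from the factory):

int id = net->gpuId();
if (id == -1) {
    // the network runs on the CPU
} else {
    // id is the CUDA device the network was configured with
}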

◆ infer()

tl::expected< NN::Tensor, NN::NeuralError > NN::OrtNeuralNet::infer ( const Tensor & input )
override virtual

Perform inference on the given input Tensor using the ONNX model.

Parameters
    input  The input Tensor to the neural network. This tensor must have the correct shape and data type expected by the model.
Returns
tl::expected<Tensor, NeuralError> The output Tensor or an error object.

Implements NN::NeuralNet.
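
A short sketch of the typical error handling around a call, mirroring the usage example above (variable names are illustrative):

auto result = net->infer(inputTensor); // inputTensor must match the model's input shape and type
if (!result) {
    // result.error() is a NeuralError, e.g. reporting a shape or data-type mismatch
    return -1;
}
Tensor outputTensor = result.value();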

◆ inputShape()

NN::Shape NN::OrtNeuralNet::inputShape ( ) const
override virtual

Get the input shape of the neural network.

Returns
Shape The shape of the input tensor expected by the model.

Implements NN::NeuralNet.
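
Since Shape is a std::vector<int64_t>, the returned value can be inspected directly. A small sketch that counts the expected input elements (skipping dynamic axes, reported as -1, is an illustrative assumption):

Shape in = net->inputShape(); // e.g. {1, 3, 224, 224}
int64_t elements = 1;
for (int64_t dim : in) {
    if (dim > 0) elements *= dim; // -1 marks a dynamic axis that is only fixed at inference time
}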

◆ outputShape()

NN::Shape NN::OrtNeuralNet::outputShape ( ) const
override virtual

Get the output shape of the neural network.

Returns
Shape The shape of the output tensor produced by the model.

Implements NN::NeuralNet.
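
For example, the advertised output shape can be logged before running inference; this is only a sketch, and dynamic axes appear as -1 until a concrete tensor is produced:

Shape out = net->outputShape(); // e.g. {1, 1000} for a classification model
for (std::size_t i = 0; i < out.size(); ++i) {
    std::cout << out[i] << (i + 1 < out.size() ? " x " : "\n");
}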


The documentation for this class was generated from the following files:
OrtNeuralNet.h
OrtNeuralNet.cpp