Cortex.ML.Onnx
1.1.0
dotnet add package Cortex.ML.Onnx --version 1.1.0
# Cortex.ML.Onnx

ONNX Runtime model inference bridge for Cortex DataFrames. Part of the Cortex data science ecosystem for .NET.

Requires .NET 10+ and the core Cortex package.
## Features

- Load and run ONNX models directly on Cortex DataFrames
- Automatic input/output mapping between DataFrame columns and model tensors
- Hardware acceleration via CPU, CUDA, and DirectML execution providers
- Batch inference for high-throughput prediction workloads
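The batch-inference feature above amounts to partitioning the DataFrame's rows into fixed-size chunks and running the model once per chunk. The helper and the 1024-row batch size below are illustrative assumptions, not the package's actual API:

```csharp
using System;
using System.Linq;

// Illustrative sketch: how batch inference might partition row indices
// before running the model on each chunk. The batch size is an assumption.
static int[][] PartitionRows(int rowCount, int batchSize)
{
    return Enumerable.Range(0, rowCount)
        .Chunk(batchSize)   // .NET 6+: split a sequence into fixed-size groups
        .ToArray();
}

var batches = PartitionRows(rowCount: 2500, batchSize: 1024);
Console.WriteLine(batches.Length);     // 3 batches
Console.WriteLine(batches[^1].Length); // last batch holds the remainder: 452
```

Chunking keeps peak memory bounded by one batch's tensors rather than the whole DataFrame, which is what makes high-throughput prediction on large inputs practical.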
## Installation

```shell
dotnet add package Cortex.ML.Onnx
```
## Quick Start

```csharp
using Cortex;
using Cortex.ML.Onnx;

var df = DataFrame.ReadCsv("features.csv");
var session = new OnnxModel("model.onnx");

DataFrame predictions = session.Predict(df);
predictions.Head(5).Print();
```
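The automatic input mapping that `Predict` relies on presumably flattens the numeric feature columns into a row-major float buffer of shape `[rows, features]` before handing it to ONNX Runtime. The layout below is a sketch of that assumption, not the package's internal code:

```csharp
using System;

// Illustrative sketch: two feature columns ("a", "b") of 3 rows each,
// flattened row-major into one buffer of shape [rows, features].
float[][] columns =
{
    new float[] { 1f, 2f, 3f },    // column "a"
    new float[] { 10f, 20f, 30f }, // column "b"
};

int rows = columns[0].Length, features = columns.Length;
var tensor = new float[rows * features];
for (int r = 0; r < rows; r++)
    for (int c = 0; c < features; c++)
        tensor[r * features + c] = columns[c][r]; // row-major: row r, column c

Console.WriteLine(string.Join(", ", tensor)); // 1, 10, 2, 20, 3, 30
```

Row-major order matches the default dense-tensor layout ONNX models expect, so each row of the DataFrame becomes one contiguous input vector.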
## GPU Inference

```csharp
// Use the CUDA execution provider for GPU inference
var session = new OnnxModel("model.onnx", executionProvider: "CUDA");
DataFrame predictions = session.Predict(largeBatch);
```
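Since not every machine has CUDA or DirectML available, a caller may want to pick the best available provider and fall back to CPU. The provider names and preference order below are assumptions for illustration; the package's actual provider negotiation may differ:

```csharp
using System;
using System.Linq;

// Illustrative sketch: choose the first preferred execution provider that is
// actually available on this machine, falling back to CPU otherwise.
static string PickProvider(string[] available, string[] preferred)
{
    return preferred.FirstOrDefault(p => available.Contains(p)) ?? "CPU";
}

var provider = PickProvider(
    available: new[] { "DirectML", "CPU" },  // e.g. a Windows box without CUDA
    preferred: new[] { "CUDA", "DirectML" });
Console.WriteLine(provider); // DirectML
```

The chosen string could then be passed as the `executionProvider` argument shown in the snippet above.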
## Related Packages
| Package | Description |
|---|---|
| Cortex | Core DataFrame (required) |
| Cortex.ML | Classical ML models and training |
| Cortex.ML.Torch | TorchSharp GPU training |
| Cortex.Vision | Vision models with ONNX inference |
## Compatibility

| Product | Compatible and computed target frameworks |
|---|---|
| .NET | net10.0 is compatible. net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Included target framework (in package):

- net10.0
## Dependencies (net10.0)

- Cortex (>= 1.1.0)
- Cortex.ML (>= 1.1.0)
- Microsoft.ML.OnnxRuntime (>= 1.24.4)