Microsoft.ML.OnnxRuntime.Gpu.Windows
1.22.1
Prefix Reserved
.NET CLI: dotnet add package Microsoft.ML.OnnxRuntime.Gpu.Windows --version 1.22.1
Package Manager: NuGet\Install-Package Microsoft.ML.OnnxRuntime.Gpu.Windows -Version 1.22.1
PackageReference: <PackageReference Include="Microsoft.ML.OnnxRuntime.Gpu.Windows" Version="1.22.1" />
Central Package Management (Directory.Packages.props): <PackageVersion Include="Microsoft.ML.OnnxRuntime.Gpu.Windows" Version="1.22.1" />
Central Package Management (project file): <PackageReference Include="Microsoft.ML.OnnxRuntime.Gpu.Windows" />
Paket CLI: paket add Microsoft.ML.OnnxRuntime.Gpu.Windows --version 1.22.1
Script & Interactive: #r "nuget: Microsoft.ML.OnnxRuntime.Gpu.Windows, 1.22.1"
File-based apps: #:package Microsoft.ML.OnnxRuntime.Gpu.Windows@1.22.1
Cake addin: #addin nuget:?package=Microsoft.ML.OnnxRuntime.Gpu.Windows&version=1.22.1
Cake tool: #tool nuget:?package=Microsoft.ML.OnnxRuntime.Gpu.Windows&version=1.22.1
About
ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms.
Learn more in the ONNX Runtime documentation.
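Before the package details, here is a minimal sketch of running one inference on the GPU with this package. The model path "model.onnx", the input name "input", and the input shape are placeholders for your own model, and the machine is assumed to have a matching CUDA/cuDNN runtime installed.

```csharp
// Minimal sketch: one inference on the CUDA execution provider.
// "model.onnx" and the input name "input" are placeholders for your own model.
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var options = new SessionOptions();
options.AppendExecutionProvider_CUDA(0);                 // GPU device 0; requires CUDA + cuDNN
using var session = new InferenceSession("model.onnx", options);

// Dummy NCHW float input; match your model's expected shape and input name.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var feeds = new List<NamedOnnxValue> { NamedOnnxValue.CreateFromTensor("input", input) };

using var results = session.Run(feeds);                  // nodes not placed on CUDA fall back to CPU
var output = results.First().AsTensor<float>();
```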
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu (see the execution-provider sketch after this list)
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
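The Gpu entry above stacks three providers. They are consulted in the order they are appended to SessionOptions, and any node left unassigned runs on the always-available CPU provider. A short sketch, assuming the TensorRT and CUDA runtimes are present (the model path is a placeholder):

```csharp
// Sketch: provider priority for the GPU package.
// Registration order = priority order; CPU is the implicit final fallback.
using Microsoft.ML.OnnxRuntime;

using var options = new SessionOptions();
options.AppendExecutionProvider_Tensorrt(0);   // tried first (needs the TensorRT libraries)
options.AppendExecutionProvider_CUDA(0);       // tried next (needs CUDA + cuDNN)
options.GraphOptimizationLevel = GraphOptimizationLevel.ORT_ENABLE_ALL;

using var session = new InferenceSession("model.onnx", options);   // placeholder path
```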
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
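The Extensions package above ships its pre/post-processing custom operators as a separate library that each session has to opt into. A sketch of wiring it up, assuming the RegisterOrtExtensions() extension method that package exposes on SessionOptions:

```csharp
// Sketch: enable the Microsoft.ML.OnnxRuntime.Extensions custom operators for a session,
// assuming its RegisterOrtExtensions() extension method on SessionOptions.
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Extensions;

using var options = new SessionOptions();
options.RegisterOrtExtensions();               // registers the custom op library with this session
using var session = new InferenceSession("model_with_custom_ops.onnx", options);  // placeholder path
```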
Dependencies
Learn more about Target Frameworks and .NET Standard.
.NETCoreApp
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.1)
.NETFramework
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.1)
.NETStandard
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.1)
NuGet packages (3)
Showing the top 3 NuGet packages that depend on Microsoft.ML.OnnxRuntime.Gpu.Windows:
Microsoft.ML.OnnxRuntime.Gpu
- This package contains native shared library artifacts for all supported platforms of ONNX Runtime.
KokoroSharp.GPU.Windows
- The Gpu.Windows runtime for KokoroSharp: an inference engine for Kokoro TTS with ONNX Runtime, enabling fast and flexible local text-to-speech (full-precision and quantized models) purely via C#. It features segment streaming, voice mixing, linear job scheduling, and optional playback.
YoloSharpDeploGPU
- YOLO model inference with ONNX (currently supports object classification).
GitHub repositories (1)
Showing the top 1 popular GitHub repositories that depend on Microsoft.ML.OnnxRuntime.Gpu.Windows:
Lyrcaxis/KokoroSharp
- Fast local TTS inference engine in C# with ONNX Runtime. Multi-speaker, multi-platform and multilingual. Integrate it into your .NET projects using a plug-and-play NuGet package, complete with all voices.
Version History
Version | Downloads | Last Updated
---|---|---
1.22.1 | 13,073 | 7/1/2025
1.22.0 | 10,759 | 5/9/2025
1.21.2 | 5,469 | 4/24/2025
1.21.1 | 1,506 | 4/21/2025
1.21.0 | 35,291 | 3/8/2025
1.20.1 | 109,161 | 11/21/2024
1.20.0 | 28,592 | 10/31/2024
1.19.2 | 119,733 | 9/3/2024
1.19.1 | 18,625 | 8/21/2024
1.19.0 | 14,915 | 8/17/2024
1.19.0-dev-20240812-1833-cc... | 1,787 | 8/13/2024
1.18.1 | 39,374 | 6/27/2024
1.18.0 | 27,324 | 5/17/2024
1.17.3 | 47,034 | 4/10/2024
1.17.1 | 39,477 | 2/25/2024
1.17.0 | 33,134 | 1/31/2024
Release Def:
Branch: refs/heads/rel-1.22.1
Commit: 89746dc19a0a1ae59ebf4b16df9acab8f99f3925
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=845064