AICentral 0.6.12-pullrequest0039-0002
This is a prerelease version of AICentral.
There is a newer version of this package available.
See the version list below for details.
dotnet add package AICentral --version 0.6.12-pullrequest0039-0002
NuGet\Install-Package AICentral -Version 0.6.12-pullrequest0039-0002
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="AICentral" Version="0.6.12-pullrequest0039-0002" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add AICentral --version 0.6.12-pullrequest0039-0002
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: AICentral, 0.6.12-pullrequest0039-0002"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install AICentral as a Cake Addin
#addin nuget:?package=AICentral&version=0.6.12-pullrequest0039-0002&prerelease

// Install AICentral as a Cake Tool
#tool nuget:?package=AICentral&version=0.6.12-pullrequest0039-0002&prerelease
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
AI Central
AI Central gives you control over your AI services.
- Intelligent Routing
- Custom consumer OAuth2 authorisation
- Fallback AI service
- Round Robin AI services
- Lowest Latency AI service
- Circuit breakers and backoff-retry over downstream AI services
- Request-based and token-based rate limiting
- Prompt and usage logging
- Works for streaming as well as non-streaming endpoints
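The fallback behaviour can be pictured with a small sketch. This is not the package's code (AI Central is written in C# and configured declaratively, as shown below); it is a hypothetical Python illustration of the prioritised-with-fallback idea: try each priority endpoint in order, and move on to the fallbacks only when a call fails.

```python
def select_and_call(priority_endpoints, fallback_endpoints, call):
    """Try each priority endpoint in order, then each fallback endpoint.

    `call` invokes one named endpoint and raises on failure.
    Returns the first successful response.
    """
    last_error = None
    for endpoint in priority_endpoints + fallback_endpoints:
        try:
            return call(endpoint)
        except Exception as error:  # a real proxy would retry only transient errors
            last_error = error
    raise RuntimeError("all endpoints failed") from last_error


# Demo: the priority endpoint is down, so the fallback answers.
responses = {"openai-priority": None, "openai-fallback": "ok"}

def call(name):
    if responses[name] is None:
        raise ConnectionError(name)
    return responses[name]

result = select_and_call(["openai-priority"], ["openai-fallback"], call)
```

In the real pipeline this decision is made per request by the configured endpoint selector, combined with the circuit-breaker and retry policies.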
Configuration
See the docs on GitHub for more details.
Minimal
This sample produces an AI Central proxy that:
- Listens on a hostname of your choosing
- Proxies requests straight through to a back-end Azure OpenAI server
- Can be accessed using standard SDKs
{
"AICentral": {
"Endpoints": [
{
"Type": "AzureOpenAIEndpoint",
"Name": "openai-1",
"Properties": {
"LanguageEndpoint": "https://<my-ai>.openai.azure.com"
}
}
],
"EndpointSelectors": [
{
"Type": "SingleEndpoint",
"Name": "default",
"Properties": {
"Endpoint": "openai-1"
}
}
],
"Pipelines": [
{
"Name": "OpenAIPipeline",
"Host": "mypipeline.mydomain.com",
"EndpointSelector": "default"
}
]
}
}
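Because the pipeline presents an Azure OpenAI-style surface, clients only need to point their endpoint at the pipeline host. As a sketch (hypothetical helper; the deployment name and `api-version` value here are placeholders, not taken from the config above), the request URL a standard Azure OpenAI client would hit looks like this:

```python
def build_chat_completions_url(host: str, deployment: str, api_version: str) -> str:
    """Build an Azure OpenAI-style chat-completions URL, addressed at the
    AI Central pipeline host instead of the *.openai.azure.com endpoint."""
    return (
        f"https://{host}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )


# Point a client at the pipeline host from the config above:
url = build_chat_completions_url(
    "mypipeline.mydomain.com", "gpt-35-turbo", "2024-02-01"
)
```

Any SDK that lets you override the Azure OpenAI endpoint base URL can be aimed at the pipeline host in the same way.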
Full example
This pipeline will:
- Present an Azure OpenAI and an OpenAI downstream as a single upstream endpoint
- Map incoming Azure OpenAI deployment names to OpenAI models
- Present the result as an Azure OpenAI-style endpoint
- Protect the front-end by requiring an AAD token issued for your own AAD application
- Put a local ASP.NET Core rate-limiting policy over the endpoint
- Add logging to Azure Monitor
- Log quota and client caller information, and in this case the prompt but not the response
{
"AICentral": {
"Endpoints": [
{
"Type": "AzureOpenAIEndpoint",
"Name": "openai-priority",
"Properties": {
"LanguageEndpoint": "https://<my-ai>.openai.azure.com",
"AuthenticationType": "Entra|EntraPassThrough|ApiKey",
"MaxConcurrency": 10
}
},
{
"Type": "OpenAIEndpoint",
"Name": "openai-fallback",
"Properties": {
"LanguageEndpoint": "https://api.openai.com",
"ModelMappings": {
"Gpt35Turbo0613": "gpt-3.5-turbo",
"Ada002Embedding": "text-embedding-ada-002"
},
"ApiKey": "<my-api-key>",
"Organization": "<optional-organisation>"
}
}
],
"AuthProviders": [
{
"Type": "Entra",
"Name": "simple-aad",
"Properties": {
"ClientId": "<my-client-id>",
"TenantId": "<my-tenant-id>",
"Instance": "https://login.microsoftonline.com/",
"Audience": "<custom-audience>"
}
}
],
"EndpointSelectors": [
{
"Type": "Prioritised",
"Name": "my-endpoint-selector",
"Properties": {
"PriorityEndpoints": ["openai-priority"],
"FallbackEndpoints": ["openai-fallback"]
}
}
],
"GenericSteps": [
{
"Type": "AspNetCoreFixedWindowRateLimiting",
"Name": "token-rate-limiter",
"Properties": {
"LimitType": "PerConsumer|PerAICentralEndpoint",
"MetricType": "Tokens",
"Options": {
"Window": "00:01:00",
"PermitLimit": 1000
}
}
},
{
"Type": "AspNetCoreFixedWindowRateLimiting",
"Name": "window-rate-limiter",
"Properties": {
"LimitType": "PerConsumer|PerAICentralEndpoint",
"MetricType": "Requests",
"Options": {
"Window": "00:00:10",
"PermitLimit": 100
}
}
},
{
"Type": "AzureMonitorLogger",
"Name": "azure-monitor-logger",
"Properties": {
"WorkspaceId": "<workspace-id>",
"Key": "<key>",
"LogPrompt": true,
"LogResponse": false
}
},
{
"Type": "BulkHead",
"Name": "bulk-head",
"Properties": {
"MaxConcurrency": 20
}
}
],
"Pipelines": [
{
"Name": "MyPipeline",
"Host": "prioritypipeline.mydomain.com",
"EndpointSelector": "my-endpoint-selector",
"AuthProvider": "simple-aad",
"Steps": [
"window-rate-limiter",
"bulk-head",
"azure-monitor-logger"
]
}
]
}
}
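The two `AspNetCoreFixedWindowRateLimiting` steps above permit a fixed budget (requests or tokens) per time window. AI Central delegates this to ASP.NET Core's rate-limiting middleware; purely as an illustration of the fixed-window idea, a minimal counter might look like this:

```python
import time


class FixedWindowLimiter:
    """Permit at most `permit_limit` units per fixed window of `window_seconds`."""

    def __init__(self, permit_limit: int, window_seconds: float):
        self.permit_limit = permit_limit
        self.window_seconds = window_seconds
        self.window_start = time.monotonic()
        self.used = 0

    def try_acquire(self, cost: int = 1) -> bool:
        now = time.monotonic()
        if now - self.window_start >= self.window_seconds:
            # A new window has started: reset the counter.
            self.window_start = now
            self.used = 0
        if self.used + cost > self.permit_limit:
            return False  # would exceed this window's permit limit
        self.used += cost
        return True


# e.g. the token-rate-limiter step above: 1000 tokens per one-minute window
limiter = FixedWindowLimiter(permit_limit=1000, window_seconds=60)
```

With `MetricType` set to `Tokens`, each request's cost is its token count rather than 1, which is why the proxy counts tokens (via SharpToken) even for streaming responses.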
Product | Versions
---|---
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos and net8.0-windows were computed.

Dependencies (net8.0)
- AICentral.Core (>= 0.6.12-pullrequest0039-0002)
- Azure.Identity (>= 1.10.4)
- Microsoft.AspNetCore.RateLimiting (>= 7.0.0-rc.2.22476.2)
- Microsoft.Extensions.Http.Polly (>= 8.0.0)
- Microsoft.Identity.Web (>= 2.16.0)
- Polly (>= 8.2.0)
- SharpToken (>= 1.2.14)
- System.Text.Json (>= 8.0.0)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.