ChatGPTSharp 3.0.1
ChatGPTSharp
Modern C# client for chat, tools, streaming, and multimodal messages (image/audio). It supports ConversationId-based continuity, flexible request extensions, and manual control when you need it.
Features
- ConversationId-based continuity
- Image and audio inputs via a unified content array
- Tool definitions + tool-call responses
- Streaming responses via IAsyncEnumerable
- Extra request fields via ExtraBody
- No local token counting or token limits
Installation
dotnet add package ChatGPTSharp
Quick Start (Stateless Chat)
using ChatGPTSharp;
using ChatGPTSharp.Model;
var settings = new ChatGPTClientSettings
{
OpenAIKey = File.ReadAllText("KEY.txt"),
ModelName = "gpt-4o-mini"
};
var client = new ChatGPTClient(settings);
var result = await client.SendMessage(new List<MessageContent>
{
MessageContent.FromText("Hello!")
});
Console.WriteLine(result.Response);
Continuous Conversation
Use SendMessageWithConversation to record history and return ConversationId + MessageId.
var first = await client.SendMessageWithConversation(new List<MessageContent>
{
MessageContent.FromText("Hello!")
});
var second = await client.SendMessageWithConversation(
new List<MessageContent> { MessageContent.FromText("Continue the conversation") },
conversationId: first.ConversationId ?? "",
parentMessageId: first.MessageId ?? "");
Console.WriteLine(second.Response);
Simplified Usage (Session)
Use CreateSession to keep state without manually passing ConversationId and MessageId.
It also supports a simplified content call without building List<MessageContent>.
var session = client.CreateSession(systemPrompt: "You are a concise assistant.");
var first = await session.SendAsync("Hello!");
var second = await session.SendAsync("Continue the conversation.");
var multimodal = await session.SendAsync(
"Describe this image.",
MessageContent.FromImageUrl("https://example.com/demo.png"));
Console.WriteLine(second.Response);
Console.WriteLine(multimodal.Response);
System Prompt
var result = await client.SendMessage(
new List<MessageContent> { MessageContent.FromText("Summarize this in one sentence.") },
systemPrompt: "You are a concise assistant.");
Image Input
var contents = new List<MessageContent>
{
MessageContent.FromText("Describe these images"),
MessageContent.FromImageFile(@"C:\Images\demo.jpg", ImageDetailMode.Low),
MessageContent.FromImageUrl("https://example.com/demo.png", ImageDetailMode.Auto)
};
var result = await client.SendMessage(contents);
Audio Input
var contents = new List<MessageContent>
{
MessageContent.FromText("Transcribe this audio"),
MessageContent.FromAudioUrl("https://example.com/sample.mp3"),
MessageContent.FromAudioFile(@"C:\Audio\sample.mp3", "audio/mpeg")
};
var result = await client.SendMessage(contents);
Streaming
await foreach (var evt in client.SendMessageStream(new List<MessageContent>
{
MessageContent.FromText("Tell me a story")
}))
{
if (evt.IsDone)
{
Console.WriteLine("\n[done]");
break;
}
if (!string.IsNullOrEmpty(evt.DeltaText))
{
Console.Write(evt.DeltaText);
}
}
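If the complete reply is also needed after streaming finishes, the delta chunks can be collected as they arrive. This is a sketch that reuses the same SendMessageStream call and assumes only the DeltaText and IsDone members shown above:

```csharp
using System.Text;

var buffer = new StringBuilder();
await foreach (var evt in client.SendMessageStream(new List<MessageContent>
{
    MessageContent.FromText("Tell me a story")
}))
{
    if (evt.IsDone) break;
    if (!string.IsNullOrEmpty(evt.DeltaText))
    {
        Console.Write(evt.DeltaText); // live output
        buffer.Append(evt.DeltaText); // accumulate for later use
    }
}
var fullReply = buffer.ToString();
```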
Tools (Function Calling)
Define a tool schema, send a request, then execute the tool and return its result.
using System.Text.Json.Nodes;
var weatherSchema = JsonNode.Parse(@"{
    ""type"": ""object"",
    ""properties"": {
        ""city"": { ""type"": ""string"" }
    },
    ""required"": [""city""]
}") as JsonObject;
var tools = new List<ToolDefinition>
{
ToolDefinition.CreateFunction(new ToolFunctionDefinition(
"get_weather",
"Get current weather",
weatherSchema))
};
var toolResult = await client.SendMessage(
new List<MessageContent> { MessageContent.FromText("What's the weather in Paris?") },
tools: tools);
if (toolResult.ToolCalls?.Count > 0)
{
var call = toolResult.ToolCalls[0];
var args = JsonNode.Parse(call.Function.Arguments) as JsonObject;
var city = args?["city"]?.GetValue<string>();
// Your tool execution
var weather = new JsonObject
{
["city"] = city,
["tempC"] = 18
};
// Continue with manual message list
var messages = new List<ChatMessage>
{
new ChatMessage
{
Role = RoleType.User,
Contents = new List<MessageContent> { MessageContent.FromText("What's the weather in Paris?") }
},
new ChatMessage { Role = RoleType.Assistant, ToolCalls = toolResult.ToolCalls },
new ChatMessage
{
Role = RoleType.Tool,
ToolCallId = call.Id,
Contents = new List<MessageContent> { MessageContent.FromText(weather.ToString()) }
}
};
var followup = await client.SendAsync(new ChatRequest
{
Messages = messages,
Tools = tools
});
Console.WriteLine(followup.Message?.GetTextContent());
}
Responses API (Server-Side State + Built-in Tools)
The Responses API supports server-side state via conversation or previous_response_id, and built-in tools.
using System.Text.Json.Nodes;
var response = await client.CreateResponseAsync(new ResponseRequest
{
Instructions = "You are concise.",
Input = MessageContent.BuildResponseInput(RoleType.User, new List<MessageContent>
{
MessageContent.FromText("Summarize this in one sentence.")
}),
Store = true,
Tools = new List<ResponseTool>
{
ResponseTool.BuiltIn("web_search")
}
});
Console.WriteLine(response.GetOutputText());
Create a conversation on the server and keep adding items:
var convo = await client.CreateConversationAsync();
var items = MessageContent.BuildResponseInput(RoleType.User, new List<MessageContent>
{
MessageContent.FromText("Remember this preference: I like short answers.")
});
await client.AddConversationItemsAsync(convo.Id ?? "", items);
var followup = await client.CreateResponseAsync(new ResponseRequest
{
ConversationId = convo.Id,
Input = MessageContent.BuildResponseInput(RoleType.User, new List<MessageContent>
{
MessageContent.FromText("What should you remember?")
})
});
Console.WriteLine(followup.GetOutputText());
ExtraBody (Request Extensions)
var extra = new Dictionary<string, object?>
{
["response_format"] = new { type = "json_object" }
};
var result = await client.SendMessage(
new List<MessageContent> { MessageContent.FromText("Return JSON with fields a and b") },
extraBody: extra);
You can also set defaults:
settings.ExtraBody = new Dictionary<string, object?>
{
["response_format"] = new { type = "json_object" }
};
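When the model returns JSON (as requested via response_format above), the reply text can be parsed with System.Text.Json. This sketch assumes the Response property holds the raw JSON string and that the object has the fields a and b named in the earlier prompt; JsonNode is standard .NET:

```csharp
using System.Text.Json.Nodes;

// result.Response is expected to hold a JSON object such as {"a": ..., "b": ...}
var json = JsonNode.Parse(result.Response) as JsonObject;
var a = json?["a"]?.ToString();
var b = json?["b"]?.ToString();
Console.WriteLine($"a={a}, b={b}");
```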
Advanced: Manual Requests
Use SendAsync and StreamAsync with a ChatRequest when you want full control over messages.
var request = new ChatRequest
{
Model = "gpt-4o-mini",
Messages = new List<ChatMessage>
{
new ChatMessage
{
Role = RoleType.System,
Contents = new List<MessageContent> { MessageContent.FromText("You are concise.") }
},
new ChatMessage
{
Role = RoleType.User,
Contents = new List<MessageContent> { MessageContent.FromText("Explain async streams.") }
}
}
};
var response = await client.SendAsync(request);
Console.WriteLine(response.Message?.GetTextContent());
Configuration
var settings = new ChatGPTClientSettings
{
OpenAIKey = File.ReadAllText("KEY.txt"),
ModelName = "gpt-4o-mini",
BaseUrl = "https://api.openai.com/",
ProxyUri = "http://127.0.0.1:1080",
TimeoutSeconds = 60
};
Notes:
BaseUrl and ProxyUri can be used to route requests. BaseUrl may include /v1 and will still resolve correctly.
This codebase references node-chatgpt-api.
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 is compatible. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 is compatible. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 is compatible. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
- .NETStandard 2.0
  - Microsoft.Bcl.AsyncInterfaces (>= 10.0.2)
  - System.Text.Json (>= 8.0.5)
- .NETStandard 2.1
  - System.Text.Json (>= 8.0.5)
- net7.0
  - System.Text.Json (>= 8.0.5)
- net8.0
  - System.Text.Json (>= 8.0.5)
- net9.0
  - System.Text.Json (>= 8.0.5)
| Version | Downloads | Last Updated |
|---|---|---|
| 3.0.1 | 29 | 1/30/2026 |
| 3.0.0 | 29 | 1/29/2026 |
| 2.0.8 | 432 | 1/22/2025 |
| 2.0.7 | 162 | 1/22/2025 |
| 2.0.6 | 166 | 1/21/2025 |
| 2.0.5 | 171 | 12/24/2024 |
| 2.0.4 | 279 | 5/14/2024 |
| 2.0.3 | 165 | 5/14/2024 |
| 2.0.2 | 171 | 5/13/2024 |
| 2.0.1 | 185 | 4/28/2024 |
| 2.0.0 | 209 | 4/28/2024 |
| 2.0.0-alpha.4 | 170 | 2/22/2024 |
| 2.0.0-alpha.3 | 98 | 2/22/2024 |
| 2.0.0-alpha.1 | 189 | 12/22/2023 |
| 1.1.4 | 672 | 7/11/2023 |
| 1.1.3 | 412 | 5/8/2023 |
| 1.1.2 | 323 | 4/29/2023 |
| 1.1.1 | 574 | 3/20/2023 |
| 1.1.0 | 386 | 3/20/2023 |
| 1.0.9 | 560 | 3/7/2023 |
| 1.0.8 | 426 | 3/6/2023 |
| 1.0.7 | 528 | 3/4/2023 |
| 1.0.6 | 541 | 3/3/2023 |
| 1.0.5 | 537 | 3/3/2023 |
| 1.0.4 | 493 | 3/2/2023 |
| 1.0.3 | 480 | 3/2/2023 |
| 1.0.2 | 489 | 3/2/2023 |
| 1.0.1 | 407 | 2/28/2023 |
| 1.0.0 | 408 | 2/28/2023 |