Prompty.Core
2.0.0-alpha.1
See the version list below for details.
Install with your preferred tool:

- .NET CLI: `dotnet add package Prompty.Core --version 2.0.0-alpha.1`
- Package Manager: `NuGet\Install-Package Prompty.Core -Version 2.0.0-alpha.1`
- PackageReference: `<PackageReference Include="Prompty.Core" Version="2.0.0-alpha.1" />`
- Central package management: `<PackageVersion Include="Prompty.Core" Version="2.0.0-alpha.1" />` in `Directory.Packages.props`, plus `<PackageReference Include="Prompty.Core" />` in the project file
- Paket: `paket add Prompty.Core --version 2.0.0-alpha.1`
- F# Interactive / Polyglot Notebooks: `#r "nuget: Prompty.Core, 2.0.0-alpha.1"`
- File-based C# apps: `#:package Prompty.Core@2.0.0-alpha.1`
- Cake addin: `#addin nuget:?package=Prompty.Core&version=2.0.0-alpha.1&prerelease`
- Cake tool: `#tool nuget:?package=Prompty.Core&version=2.0.0-alpha.1&prerelease`
Prompty.Core
Prompty is an asset class and format for LLM prompts designed to enhance
observability, understandability, and portability for developers. The
.prompty file format combines YAML frontmatter (model config, inputs,
tools) with a markdown body (your prompt template).
Prompty.Core is the foundation package — it loads .prompty files,
renders templates, parses messages, and orchestrates the execution pipeline.
Provider packages (OpenAI, Foundry, Anthropic) plug in via the invoker
registry.
Installation
```shell
dotnet add package Prompty.Core
```

You'll also need at least one provider package:

```shell
dotnet add package Prompty.OpenAI     # OpenAI
dotnet add package Prompty.Foundry    # Azure OpenAI / Microsoft Foundry
dotnet add package Prompty.Anthropic  # Anthropic Claude
```
Quick Start
1. Create a `.prompty` file (`chat.prompty`):

```yaml
---
name: basic-chat
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: key
    endpoint: https://api.openai.com/v1
    apiKey: ${env:OPENAI_API_KEY}
inputSchema:
  properties:
    - name: question
      kind: string
---
system:
You are a helpful assistant.

user:
{{question}}
```
2. Load and run it in C#:

```csharp
using Prompty.Core;
using Prompty.OpenAI;

// Register the provider
InvokerRegistry.RegisterExecutor("openai", new OpenAIExecutor());
InvokerRegistry.RegisterProcessor("openai", new OpenAIProcessor());

// Load, prepare messages, and execute
var agent = PromptyLoader.Load("chat.prompty");
var result = await Pipeline.InvokeAsync(agent, new Dictionary<string, object>
{
    ["question"] = "What is the meaning of life?"
});
Console.WriteLine(result);
```
Packages
| Package | Description |
|---|---|
| Prompty.Core | Loader, pipeline, renderers, parsers, tracing |
| Prompty.OpenAI | OpenAI provider — chat, embedding, image, responses API |
| Prompty.Foundry | Microsoft Foundry / Azure OpenAI — Entra ID + API key auth |
| Prompty.Anthropic | Anthropic Claude provider via Messages API |
Features
- `.prompty` file format — YAML frontmatter + Jinja2/Mustache template body
- Pipeline architecture — Render → Parse → Execute → Process
- Structured output — `outputSchema` converts to `response_format` automatically
- Streaming — `PromptyStream` (`IAsyncEnumerable<object>`) with tracing
- Agent mode — automatic tool-call loop with configurable max iterations
- Tracing — pluggable backends: console, `.tracy` files, OpenTelemetry
- Connection registry — pre-register SDK clients for reuse across prompts
- Spec compliance — passes Prompty spec test vectors
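To illustrate the structured-output feature, here is a hypothetical frontmatter sketch: the `outputSchema` block mirrors the shape of the `inputSchema` example in Quick Start, but the exact field names inside it (`answer`, `confidence`) are illustrative assumptions, not confirmed spec:

```yaml
# Hypothetical sketch: an outputSchema block that Prompty.Core would
# translate into the provider's response_format (e.g. JSON schema mode).
outputSchema:
  properties:
    - name: answer
      kind: string
    - name: confidence
      kind: number
```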
Core API
Loading
```csharp
// Sync
var agent = PromptyLoader.Load("chat.prompty");

// Async
var agent = await PromptyLoader.LoadAsync("chat.prompty");
```
Pipeline
```csharp
// Full invoke (load → render → parse → execute → process)
var result = await Pipeline.InvokeAsync("chat.prompty", inputs);

// Step-by-step control
var validated = Pipeline.ValidateInputs(agent, inputs);
var rendered = await Pipeline.RenderAsync(agent, validated);
var messages = await Pipeline.ParseAsync(agent, rendered);
var response = await Pipeline.ExecuteAsync(agent, messages);
var result = await Pipeline.ProcessAsync(agent, response);

// Or use prepare + run
var messages = await Pipeline.PrepareAsync(agent, inputs); // render + parse
var result = await Pipeline.RunAsync(agent, messages);     // execute + process
```
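The feature list also mentions streaming via `PromptyStream` (`IAsyncEnumerable<object>`). As a hedged sketch of how consumption might look — the method name `InvokeStreamAsync` is an assumption for illustration; only the `IAsyncEnumerable<object>` shape comes from the feature list:

```csharp
// Assumption: a streaming counterpart to InvokeAsync that yields chunks
// as an IAsyncEnumerable<object> (the PromptyStream shape noted in Features).
await foreach (var chunk in Pipeline.InvokeStreamAsync(agent, inputs))
{
    Console.Write(chunk); // each chunk is a partial model response
}
```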
Agent Mode (Tool Calling)
```csharp
var tools = new Dictionary<string, Func<string, Task<string>>>
{
    ["get_weather"] = async (args) => "72°F and sunny"
};

var result = await Pipeline.InvokeAgentAsync(
    agent, inputs, tools, maxIterations: 10);
```
Tracing
```csharp
// Register tracers before invoking
ConsoleTracer.Register(); // Print spans to console
PromptyTracer.Register(); // Write .tracy JSON files
OTelTracer.Register();    // OpenTelemetry spans

// Wrap custom code in a trace span
var result = await Trace.TraceAsync("my-operation", async () =>
{
    // your code here
    return value;
});
```
Connection Registry
```csharp
// Pre-register an SDK client
ConnectionRegistry.Register("my-openai", openAIClient);

// Reference it in .prompty frontmatter:
// connection:
//   kind: reference
//   name: my-openai
```
Documentation
Visit prompty.ai for full documentation, guides, and the language specification.
License
MIT
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net9.0 is compatible. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net9.0):

- Jinja2.NET (>= 1.4.1)
- Stubble.Core (>= 1.10.8)
- YamlDotNet (>= 16.3.0)
NuGet packages (4)
Showing the top 4 NuGet packages that depend on Prompty.Core:
| Package | Description |
|---|---|
| Microsoft.SemanticKernel.Prompty | Semantic Kernel Prompty format support |
| Prompty.OpenAI | OpenAI provider for Prompty — executor and processor for OpenAI chat, embedding, image, and agent APIs. |
| Prompty.Anthropic | Anthropic provider for Prompty — executor and processor for Claude models via the Anthropic Messages API. |
| Prompty.Foundry | Microsoft Foundry (Azure OpenAI) provider for Prompty — executor and processor for Azure-hosted models. |
GitHub repositories (1)
Showing the top 1 popular GitHub repositories that depend on Prompty.Core:
| Repository | Description |
|---|---|
| microsoft/semantic-kernel | Integrate cutting-edge LLM technology quickly and easily into your apps |
| Version | Downloads | Last Updated |
|---|---|---|
| 2.0.0-alpha.10 | 55 | 4/14/2026 |
| 2.0.0-alpha.9 | 55 | 4/13/2026 |
| 2.0.0-alpha.8 | 53 | 4/10/2026 |
| 2.0.0-alpha.7 | 61 | 4/9/2026 |
| 2.0.0-alpha.6 | 51 | 4/8/2026 |
| 2.0.0-alpha.2 | 55 | 4/8/2026 |
| 2.0.0-alpha.1 | 61 | 4/8/2026 |
| 0.2.3-beta | 57,471 | 7/24/2025 |
| 0.2.2-beta | 43,980 | 4/24/2025 |
| 0.2.1-beta | 192 | 4/24/2025 |
| 0.2.0-beta | 192 | 4/24/2025 |
| 0.1.0-beta | 188 | 4/23/2025 |
| 0.0.23-alpha | 65,308 | 2/10/2025 |
| 0.0.22-alpha | 139 | 2/10/2025 |
| 0.0.14-alpha | 157 | 2/10/2025 |
| 0.0.13-alpha | 516 | 1/15/2025 |
| 0.0.12-alpha | 735 | 11/26/2024 |
| 0.0.11-alpha | 1,729 | 10/25/2024 |