Prompty.Core 2.0.0-alpha.10

This is a prerelease version of Prompty.Core.

Prompty.Core

Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers. The .prompty file format combines YAML frontmatter (model config, inputs, tools) with a markdown body (your prompt template).

Prompty.Core is the foundation package — it loads .prompty files, renders templates, parses messages, and orchestrates the execution pipeline. Provider packages (OpenAI, Foundry, Anthropic) plug in via the invoker registry.

Installation

dotnet add package Prompty.Core

You'll also need at least one provider package:

dotnet add package Prompty.OpenAI      # OpenAI
dotnet add package Prompty.Foundry     # Azure OpenAI / Microsoft Foundry
dotnet add package Prompty.Anthropic   # Anthropic Claude

Quick Start

1. Create a .prompty file (chat.prompty):

---
name: basic-chat
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: key
    endpoint: https://api.openai.com/v1
    apiKey: ${env:OPENAI_API_KEY}
inputSchema:
  properties:
    - name: question
      kind: string
---
system:
You are a helpful assistant.

user:
{{question}}

2. Load and run it in C#:

using Prompty.Core;
using Prompty.OpenAI;

// One-time setup — registers renderers, parser, and providers
new PromptyBuilder()
    .AddOpenAI();

// Load, prepare messages, and execute
var agent = PromptyLoader.Load("chat.prompty");
var result = await Pipeline.InvokeAsync(agent, new Dictionary<string, object>
{
    ["question"] = "What is the meaning of life?"
});

Console.WriteLine(result);

Packages

Package             Description
Prompty.Core        Loader, pipeline, renderers, parsers, tracing
Prompty.OpenAI      OpenAI provider — chat, embedding, image, responses API
Prompty.Foundry     Microsoft Foundry / Azure OpenAI — Entra ID + API key auth
Prompty.Anthropic   Anthropic Claude provider via Messages API

Features

  • .prompty file format — YAML frontmatter + Jinja2/Mustache template body
  • Pipeline architecture — Render → Parse → Execute → Process
  • Structured output — outputSchema converts to response_format automatically
  • Streaming — PromptyStream (IAsyncEnumerable<object>) with tracing
  • Agent mode — Automatic tool-call loop with configurable max iterations
  • Tracing — Pluggable backends: console, .tracy files, OpenTelemetry
  • Connection registry — Pre-register SDK clients for reuse across prompts
  • Spec compliance — Passes Prompty spec test vectors
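As a sketch of the structured-output feature above: an outputSchema block in the frontmatter is converted to the provider's response_format. The property names and shape below are assumptions modeled on the inputSchema example in Quick Start, not confirmed against the Prompty spec.

```yaml
# Hypothetical frontmatter sketch — outputSchema is mapped to the
# provider's response_format; property names here are illustrative.
outputSchema:
  properties:
    - name: answer
      kind: string
    - name: confidence
      kind: number
```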

Core API

Loading

// Sync
var agent = PromptyLoader.Load("chat.prompty");

// Async
var agent = await PromptyLoader.LoadAsync("chat.prompty");

Pipeline

// Full invoke (load → render → parse → execute → process)
var result = await Pipeline.InvokeAsync("chat.prompty", inputs);

// Step-by-step control
var validated = Pipeline.ValidateInputs(agent, inputs);
var rendered  = await Pipeline.RenderAsync(agent, validated);
var messages  = await Pipeline.ParseAsync(agent, rendered);
var response  = await Pipeline.ExecuteAsync(agent, messages);
var result    = await Pipeline.ProcessAsync(agent, response);

// Or use prepare + run
var messages = await Pipeline.PrepareAsync(agent, inputs);  // render + parse
var result   = await Pipeline.RunAsync(agent, messages);     // execute + process

Agent Mode (Tool Calling)

var tools = new Dictionary<string, Func<string, Task<string>>>
{
    ["get_weather"] = async (args) => "72°F and sunny"
};

var result = await Pipeline.TurnAsync(
    agent, inputs, tools, maxIterations: 10);
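The overview notes that tools can also live in the frontmatter alongside model config and inputs. A hedged sketch of such a declaration, where every field name is an assumption rather than confirmed spec:

```yaml
# Hypothetical tool declaration in .prompty frontmatter — structure
# and field names are illustrative, not confirmed against the spec.
tools:
  - name: get_weather
    description: Get the current weather for a location
    parameters:
      properties:
        - name: location
          kind: string
```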

Tracing

// Register tracers before invoking
ConsoleTracer.Register();         // Print spans to console
PromptyTracer.Register();         // Write .tracy JSON files
OTelTracer.Register();            // OpenTelemetry spans

// Wrap custom code in a trace span
var result = await Trace.TraceAsync("my-operation", async () =>
{
    // your code here
    return value;
});

Connection Registry

// Pre-register an SDK client
ConnectionRegistry.Register("my-openai", openAIClient);

// Reference it in .prompty frontmatter:
// connection:
//   kind: reference
//   name: my-openai
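Putting the pieces together, a frontmatter sketch that swaps the inline key connection from Quick Start for the registered client (assuming the kind: reference form shown in the comment above):

```yaml
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: reference   # resolve via ConnectionRegistry instead of inline credentials
    name: my-openai   # must match the name passed to ConnectionRegistry.Register
```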

Documentation

Visit prompty.ai for full documentation, guides, and the language specification.

License

MIT

Target frameworks

.NET net9.0 is compatible; net10.0 and the platform-specific TFMs (android, ios, maccatalyst, macos, tvos, windows, browser) are computed as compatible.
