TokenFlow.AI
0.7.1
dotnet add package TokenFlow.AI --version 0.7.1
NuGet\Install-Package TokenFlow.AI -Version 0.7.1
<PackageReference Include="TokenFlow.AI" Version="0.7.1" />
<PackageVersion Include="TokenFlow.AI" Version="0.7.1" />
<PackageReference Include="TokenFlow.AI" />
paket add TokenFlow.AI --version 0.7.1
#r "nuget: TokenFlow.AI, 0.7.1"
#:package TokenFlow.AI@0.7.1
#addin nuget:?package=TokenFlow.AI&version=0.7.1
#tool nuget:?package=TokenFlow.AI&version=0.7.1
<p align="center"> <img src="https://github.com/AndrewClements84/TokenFlow.AI/blob/master/assets/logo.png?raw=true" alt="TokenFlow.AI" width="500"/> </p>
TokenFlow.AI
💡 Overview
TokenFlow.AI is a lightweight .NET library for tokenization, chunking, and cost estimation across modern large language models (LLMs) such as OpenAI GPT‑4o, Anthropic Claude, and Azure OpenAI.
It provides accurate token counting, intelligent text splitting, cumulative usage tracking, and real‑time cost estimation for any AI‑driven application.
Now includes CLI utilities, developer documentation, full Flow.AI ecosystem integration, and automated performance benchmarking.
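As a rough illustration of what cost estimation involves, prompt and completion tokens are priced separately and then summed. This is a back-of-the-envelope sketch only; the per-1K-token prices are hypothetical placeholders, not values from the library's model registry.

```csharp
using System;

// Illustrative arithmetic only: hypothetical per-1K-token prices, not
// TokenFlow.AI registry values.
int promptTokens = 1_200;
int completionTokens = 300;
decimal inputPricePer1K = 0.005m;   // hypothetical input price per 1K tokens
decimal outputPricePer1K = 0.015m;  // hypothetical output price per 1K tokens

decimal cost = promptTokens / 1000m * inputPricePer1K
             + completionTokens / 1000m * outputPricePer1K;

Console.WriteLine($"Estimated cost: ${cost:F4}"); // Estimated cost: $0.0105
```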
🧩 Key Features
- 🔢 GPT‑style token counting for .NET
- 🧱 Smart text chunking with configurable token limits and overlap (see the sketch after this list)
- 💰 Real‑time cost estimation for prompt and completion usage
- 🧮 TokenUsageTracker — track cumulative token and cost usage across analyses
- 🧩 Unified TokenFlowClient — analyze, chunk, and cost in one API
- ⚙️ CLI utilities (TokenFlow.Tools) — structured automation with `--format`, `--input`, and `--output` options
- 📘 Developer documentation site — API reference + usage guides via GitHub Pages
- 🧾 Benchmark suite powered by BenchmarkDotNet and integrated with CI
- 🔌 Pluggable tokenizer providers — including OpenAI `tiktoken`, Claude `cl100k_base`, and Approx fallback
- 🔗 Flow.AI.Core integration — exposes `ITokenFlowProvider` for shared usage across Flow.AI ecosystem projects
- 💬 CLI v3.0 alignment — enhanced cost commands, dynamic pricing, and Flow.AI registry integration
- 🧠 Dual targeting for .NET Standard 2.0 and .NET 8.0
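To make the chunking feature concrete, here is a minimal, self-contained sketch of how a token limit and an overlap interact. It is not TokenFlow.AI's chunking API: the whitespace "tokenizer", the `SplitByTokens` helper, and its parameter names are illustrative assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var text = string.Join(" ", Enumerable.Range(1, 25).Select(i => $"word{i}"));

// Prints three chunks of up to 10 "tokens"; each chunk repeats the last
// 2 tokens of the previous one.
foreach (var chunk in SplitByTokens(text, maxTokens: 10, overlap: 2))
    Console.WriteLine(chunk);

// Naive fixed-window splitter over whitespace tokens (assumes maxTokens > overlap).
// The real chunker counts model-specific tokens, not words.
static IEnumerable<string> SplitByTokens(string input, int maxTokens, int overlap)
{
    var tokens = input.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
    for (int start = 0; start < tokens.Length; start += maxTokens - overlap)
    {
        yield return string.Join(" ", tokens.Skip(start).Take(maxTokens));
        if (start + maxTokens >= tokens.Length) yield break;
    }
}
```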
📈 Benchmark Results (v0.7.0)
TokenFlow.AI includes full performance regression tracking integrated into CI using BenchmarkDotNet.
Results are automatically compared against a baseline to ensure no degradation beyond 10%, with benchmarks now fully toggleable via the RUN_BENCHMARKS environment flag.
| Benchmark | Mean (µs) | Error (µs) | Ratio | Allocations |
|---|---|---|---|---|
| TokenizerBenchmarks.CountTokens_OpenAI | 45.2 | 0.6 | 1.00x | 0 B |
| TokenizerBenchmarks.CountTokens_Claude | 46.8 | 0.7 | 1.04x | 0 B |
| ChunkerBenchmarks.ChunkByTokens | 210.4 | 1.8 | 1.00x | 512 B |
| CostEstimatorBenchmarks.EstimateTotalCost | 7.8 | 0.2 | 1.00x | 0 B |
Benchmarks are re‑run automatically in CI with each build, and can be disabled for faster builds using RUN_BENCHMARKS=false.
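The regression rule itself reduces to a ratio check against the stored baseline. The snippet below is a sketch of that rule, not the project's actual CI script; the baseline mean is taken from the table above and the "current" mean is an invented example value.

```csharp
using System;

// Sketch of the 10% regression gate: compare the latest mean to the baseline,
// and skip the check entirely when RUN_BENCHMARKS=false.
double baselineMeanUs = 45.2; // e.g. TokenizerBenchmarks.CountTokens_OpenAI baseline
double currentMeanUs = 47.9;  // invented value standing in for the latest run

bool benchmarksEnabled = !string.Equals(
    Environment.GetEnvironmentVariable("RUN_BENCHMARKS"),
    "false",
    StringComparison.OrdinalIgnoreCase);

if (benchmarksEnabled)
{
    double ratio = currentMeanUs / baselineMeanUs;
    Console.WriteLine($"Ratio vs baseline: {ratio:F2}x");
    if (ratio > 1.10)
        throw new InvalidOperationException("Benchmark regression: mean exceeds baseline by more than 10%.");
}
```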
🧠 Quick Examples
Model-specific tokenizers:
using TokenFlow.Tokenizers.Factory;
var factory = new TokenizerFactory();
var gptTokenizer = factory.Create("gpt-4o");
var claudeTokenizer = factory.Create("claude-3-opus");
Console.WriteLine($"GPT tokens: {gptTokenizer.CountTokens("Hello world!")}");
Console.WriteLine($"Claude tokens: {claudeTokenizer.CountTokens("Hello world!")}");
Flow.AI.Core Provider Integration:
using Flow.AI.Core.Interfaces;
using TokenFlow.AI.Integration;
ITokenFlowProvider provider = new TokenFlowProvider("gpt-4o-mini");
int tokens = provider.CountTokens("gpt-4o-mini", "Hello Flow.AI!");
Console.WriteLine($"Token count: {tokens}");
CLI Cost Analysis:
tokenflow cost --model gpt-4o --input "Estimate my token cost"
Benchmarking tokenizers:
dotnet run -c Release --project src/TokenFlow.Tools.Benchmarks
Full benchmark documentation:
See docs/tokenizers.md
🧪 Running Tests
dotnet test --no-build --verbosity normal
All unit tests are written in xUnit and run automatically through GitHub Actions.
Code coverage is tracked with Codecov, and the project maintains 100% line and branch coverage across all modules.
📊 Code Coverage by Module
| Project | Coverage | Notes |
|---|---|---|
| TokenFlow.Core | 100% | Core models and interfaces |
| TokenFlow.AI | 100% | Client, costing, registry, Flow.AI integration |
| TokenFlow.Tokenizers | 100% | OpenAI, Claude, and Approx implementations |
| TokenFlow.Tools | 100% | CLI automation and output formatting |
🔗 Flow.AI.Core Integration
TokenFlow.AI fully implements the shared Flow.AI.Core.Interfaces.ITokenFlowProvider interface.
This enables all Flow.AI components — including PromptStream.AI, DataFlow.AI, and ChatFlow.AI —
to perform token counting and cost analysis through a unified provider contract.
TokenFlow.AI serves as the engine layer of the Flow.AI ecosystem, powering all higher-level orchestration frameworks.
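To show what the shared contract buys a downstream component, here is a minimal dependency-injection sketch. Only `ITokenFlowProvider`, `TokenFlowProvider`, and `CountTokens` come from the example above; the `PromptBudget` class and the Microsoft.Extensions.DependencyInjection wiring are illustrative assumptions about how a consumer might use the provider.

```csharp
using System;
using Flow.AI.Core.Interfaces;
using Microsoft.Extensions.DependencyInjection;
using TokenFlow.AI.Integration;

// Register the provider once, then let any component depend on the interface.
var services = new ServiceCollection()
    .AddSingleton<ITokenFlowProvider>(_ => new TokenFlowProvider("gpt-4o-mini"))
    .AddTransient<PromptBudget>()
    .BuildServiceProvider();

var budget = services.GetRequiredService<PromptBudget>();
Console.WriteLine(budget.FitsWithin("Hello Flow.AI!", maxTokens: 50)); // True

// Hypothetical consumer: it only knows the Flow.AI.Core contract, not TokenFlow.AI.
public sealed class PromptBudget(ITokenFlowProvider provider)
{
    public bool FitsWithin(string prompt, int maxTokens)
        => provider.CountTokens("gpt-4o-mini", prompt) <= maxTokens;
}
```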
🛠️ Roadmap
✅ Completed
- Core interfaces and models (`ITokenizer`, `ICostEstimator`, `ModelSpec`, `TokenCountResult`)
- Added `TokenFlow.Tokenizers` with advanced tokenizers (`OpenAITikTokenizer`, `ClaudeTokenizer`, `ApproxTokenizer`)
- Extended `TokenizerFactory` to handle OpenAI/Claude families ✅
- Added TokenFlow.Tools.Benchmarks for tokenizer performance analysis ✅
- Achieved 100% code coverage across all projects ✅
- CLI v2.1 released with structured automation ✅
- Developer documentation site (API + usage guides) ✅
- Integrated Flow.AI.Core v0.1.0 and implemented `ITokenFlowProvider` ✅
- Full integration tests and shared registry loading ✅
- v0.6.1 — Performance Regression Tracking integrated with CI ✅
- v0.6.2 — Enhanced Cost Estimator using Flow.AI.Core registry ✅
- v0.7.0 — CLI Alignment & Ecosystem Integration ✅
🌟 Future Goals
- Extend CLI tooling for full Flow.AI ecosystem interoperability
- Implement enhanced Flow.AI shared configuration support
- Begin PromptStream.AI cockpit integration phase
💬 Contributing
Pull requests are welcome!
If you’d like to contribute to TokenFlow.AI, please read CONTRIBUTING.md once it is published.
🪪 License
Distributed under the MIT License.
See LICENSE for details.
⭐ If you find TokenFlow.AI useful, please give the repository a star on GitHub!
It helps others discover the project and supports ongoing development.
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETStandard 2.0
- Flow.AI.Core (>= 0.1.0)
- Newtonsoft.Json (>= 13.0.1)
- System.Text.Json (>= 9.0.9)
- TokenFlow.Core (>= 0.7.1)

net8.0
- Flow.AI.Core (>= 0.1.0)
- Newtonsoft.Json (>= 13.0.1)
- System.Text.Json (>= 9.0.9)
- TokenFlow.Core (>= 0.7.1)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on TokenFlow.AI:
- PromptStream.AI — Token-aware prompt composition, validation, and conversational context toolkit for .NET.
- PromptStream.AI.Integration.TokenFlow — Integration adapter connecting PromptStream.AI with TokenFlow.AI for model-aware tokenization.
GitHub repositories
This package is not used by any popular GitHub repositories.