Microsoft.KernelMemory.Core 0.24.231228.5

Prefix Reserved
This package has a SemVer 2.0.0 package version: 0.24.231228.5+eed4b78.
There is a newer version of this package available.
See the version list below for details.
dotnet add package Microsoft.KernelMemory.Core --version 0.24.231228.5                
NuGet\Install-Package Microsoft.KernelMemory.Core -Version 0.24.231228.5                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Microsoft.KernelMemory.Core" Version="0.24.231228.5" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add Microsoft.KernelMemory.Core --version 0.24.231228.5                
#r "nuget: Microsoft.KernelMemory.Core, 0.24.231228.5"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install Microsoft.KernelMemory.Core as a Cake Addin
#addin nuget:?package=Microsoft.KernelMemory.Core&version=0.24.231228.5

// Install Microsoft.KernelMemory.Core as a Cake Tool
#tool nuget:?package=Microsoft.KernelMemory.Core&version=0.24.231228.5                

Kernel Memory

License: MIT

Kernel Memory (KM) is a multi-modal AI Service specialized in the efficient indexing of datasets through custom continuous data hybrid pipelines, with support for Retrieval Augmented Generation (RAG), synthetic memory, prompt engineering, and custom semantic memory processing.

KM includes a GPT Plugin, web clients, a .NET library for embedded applications, and, soon, a Docker container.

Utilizing advanced embeddings and LLMs, the system enables Natural Language querying for obtaining answers from the indexed data, complete with citations and links to the original sources.

Designed for seamless integration as a Plugin with Semantic Kernel, Microsoft Copilot and ChatGPT, Kernel Memory enhances data-driven features in applications built for the most popular AI platforms.

.NET packages

  • Microsoft.KernelMemory.WebClient: The web client library, which can be used to call a running instance of the Memory web service. .NET Standard 2.0 compatible.

    NuGet package Example code

  • Microsoft.KernelMemory.SemanticKernelPlugin: A Memory plugin for Semantic Kernel, replacing the original Semantic Memory available in SK. .NET Standard 2.0 compatible.

    NuGet package Example code

  • Microsoft.KernelMemory.Abstractions: The internal interfaces and models shared by all packages, used to extend KM to support third-party services. .NET Standard 2.0 compatible.

    NuGet package

  • Microsoft.KernelMemory.MemoryDb.AzureAISearch: Memory storage using Azure AI Search.

    NuGet package

  • Microsoft.KernelMemory.MemoryDb.Postgres: Memory storage using PostgreSQL.

    NuGet package

  • Microsoft.KernelMemory.MemoryDb.Qdrant: Memory storage using Qdrant.

    NuGet package

  • Microsoft.KernelMemory.AI.AzureOpenAI: Integration with Azure OpenAI LLMs.

    NuGet package

  • Microsoft.KernelMemory.AI.LlamaSharp: Integration with LLaMA LLMs.

    NuGet package

  • Microsoft.KernelMemory.AI.OpenAI: Integration with OpenAI LLMs.

    NuGet package

  • Microsoft.KernelMemory.DataFormats.AzureAIDocIntel: Integration with Azure AI Document Intelligence.

    NuGet package

  • Microsoft.KernelMemory.Orchestration.AzureQueues: Ingestion and synthetic memory pipelines via Azure Queue Storage.

    NuGet package

  • Microsoft.KernelMemory.Orchestration.RabbitMQ: Ingestion and synthetic memory pipelines via RabbitMQ.

    NuGet package

  • Microsoft.KernelMemory.ContentStorage.AzureBlobs: Used to store content on Azure Storage Blobs.

    NuGet package

  • Microsoft.KernelMemory.Core: The core library, which can be used to build custom pipelines and handlers; it also contains a serverless client to use memory synchronously, without the web service. .NET 6+.

    NuGet package Example code

Packages for Python, Java and other languages

The Kernel Memory service offers a Web API out of the box, including OpenAPI (Swagger) documentation that you can use to test the API and build custom web clients. For instance, after starting the service locally, see http://127.0.0.1:9001/swagger/index.html.
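As an illustration of calling that Web API without any client library, here is a minimal Python sketch. It is not an official client: the `/ask` path, payload shape, and Content-Type header follow the curl example shown later in this document, and `base_url` assumes a locally running service.

```python
import json
import urllib.request


def build_ask_payload(query: str) -> bytes:
    """Build the JSON body for the /ask endpoint (shape taken from the curl example)."""
    return json.dumps({"query": query}).encode("utf-8")


def ask(query: str, base_url: str = "http://127.0.0.1:9001") -> dict:
    """POST a question to a running Kernel Memory service and return the parsed response."""
    req = urllib.request.Request(
        base_url + "/ask",
        data=build_ask_payload(query),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # requires the service to be running
        return json.load(resp)


# Example usage (only works against a running service):
# answer = ask("Any news from NASA about Orion?")
# print(answer["Text"])
```

Any language with an HTTP client can integrate the same way, which is what the official web clients do under the hood.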

A Python package with a web client and Semantic Kernel plugin will soon be available. We also welcome PR contributions adding support for more languages.

Data formats

  • MS Word documents
  • MS Excel spreadsheets
  • MS PowerPoint presentations
  • PDF documents
  • Web pages
  • JPG/PNG/TIFF images, with text extracted via OCR
  • Markdown
  • JSON
  • Raw plain text
  • [..] more coming 😃

Backends

ℹ️ NOTE: the documentation below is a work in progress and will evolve quickly, as the project is not yet fully functional.

Kernel Memory in serverless mode

Kernel Memory works and scales best when running as a service, allowing you to ingest thousands of documents and other information without blocking your app.

However, you can also use Kernel Memory serverless, by embedding the MemoryServerless class in your app.

Importing documents into your Kernel Memory can be as simple as this:

var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Env.Var("OPENAI_API_KEY"))
    .Build<MemoryServerless>();

// Import a file
await memory.ImportDocumentAsync("meeting-transcript.docx", tags: new() { { "user", "Blake" } });

// Import multiple files and apply multiple tags
await memory.ImportDocumentAsync(new Document("file001")
    .AddFile("business-plan.docx")
    .AddFile("project-timeline.pdf")
    .AddTag("user", "Blake")
    .AddTag("collection", "business")
    .AddTag("collection", "plans")
    .AddTag("fiscalYear", "2023"));

Asking questions:

var answer1 = await memory.AskAsync("How many people attended the meeting?");

var answer2 = await memory.AskAsync("what's the project timeline?", filter: new MemoryFilter().ByTag("user", "Blake"));

The code leverages the default document ingestion pipeline:

  1. Extract text: recognize the file format and extract the information
  2. Partition the text into small chunks, to optimize search
  3. Extract embeddings using an LLM embedding generator
  4. Save the embeddings into a vector index, such as Azure AI Search, Qdrant, or another vector DB.

Documents are organized by users, safeguarding their private information. Furthermore, memories can be categorized and structured using tags, enabling efficient search and retrieval through faceted navigation.

Data lineage, citations

All memories and answers are fully correlated to the data provided. When producing an answer, Kernel Memory includes all the information needed to verify its accuracy:

await memory.ImportDocumentAsync("NASA-news.pdf");

var answer = await memory.AskAsync("Any news from NASA about Orion?");

Console.WriteLine(answer.Result + "\n");

foreach (var x in answer.RelevantSources)
{
    Console.WriteLine($"  * {x.SourceName} -- {x.Partitions.First().LastUpdate:D}");
}

Yes, there is news from NASA about the Orion spacecraft. NASA has invited the media to see a new test version of the Orion spacecraft and the hardware that will be used to recover the capsule and astronauts upon their return from space during the Artemis II mission. The event is scheduled to take place at Naval Base San Diego on Wednesday, August 2, at 11 a.m. PDT. Personnel from NASA, the U.S. Navy, and the U.S. Air Force will be available to speak with the media. Teams are currently conducting tests in the Pacific Ocean to demonstrate and evaluate the processes, procedures, and hardware for recovery operations for crewed Artemis missions. These tests will help prepare the team for Artemis II, which will be NASA's first crewed mission under the Artemis program. The Artemis II crew, consisting of NASA astronauts Reid Wiseman, Victor Glover, and Christina Koch, and Canadian Space Agency astronaut Jeremy Hansen, will participate in recovery testing at sea next year. For more information about the Artemis program, you can visit the NASA website.

  • NASA-news.pdf -- Tuesday, August 1, 2023

Using Kernel Memory Service

Depending on your scenarios, you might want to run all the code locally inside your process, or remotely through an asynchronous service.

If you're importing small files, only need C#, and can block the process during the import, local in-process execution works well, using the MemoryServerless class shown above.

However, if you are in one of these scenarios:

  • I just need a web service to import data and answer queries
  • My app is written in TypeScript, Java, Rust, or some other language
  • I want to define custom pipelines mixing multiple languages like Python, TypeScript, etc.
  • I'm importing large documents that can take minutes to process, and I don't want to block the user interface
  • I need memory imports to run independently, supporting failures and retry logic

then you can deploy Kernel Memory as a service, plugging in the default handlers or your custom Python/TypeScript/Java/etc. handlers, leveraging the asynchronous, non-blocking memory encoding process, and sending documents and asking questions using the MemoryWebClient.

Here you can find a complete set of instructions on how to run the Kernel Memory service.

If you want to give the service a quick test, use the following command to start the Kernel Memory Service:

On WSL / Linux / MacOS:

cd service/Service
./setup.sh
./run.sh

On Windows:

cd service\Service
setup.cmd
run.cmd

To import files using Kernel Memory web service, use MemoryWebClient:

#reference clients/WebClient/WebClient.csproj

var memory = new MemoryWebClient("http://127.0.0.1:9001"); // <== URL where the web service is running

// Import a file (default user)
await memory.ImportDocumentAsync("meeting-transcript.docx");

// Import a file specifying a Document ID, User and Tags
await memory.ImportDocumentAsync("business-plan.docx",
    new DocumentDetails("user@some.email", "file001")
        .AddTag("collection", "business")
        .AddTag("collection", "plans")
        .AddTag("fiscalYear", "2023"));

Getting answers via the web service

curl http://127.0.0.1:9001/ask -d'{"query":"Any news from NASA about Orion?"}' -H 'Content-Type: application/json'
{
  "Query": "Any news from NASA about Orion?",
  "Text": "Yes, there is news from NASA about the Orion spacecraft. NASA has invited the media to see a new test version of the Orion spacecraft and the hardware that will be used to recover the capsule and astronauts upon their return from space during the Artemis II mission. The event is scheduled to take place at Naval Base San Diego on August 2nd at 11 a.m. PDT. Personnel from NASA, the U.S. Navy, and the U.S. Air Force will be available to speak with the media. Teams are currently conducting tests in the Pacific Ocean to demonstrate and evaluate the processes, procedures, and hardware for recovery operations for crewed Artemis missions. These tests will help prepare the team for Artemis II, which will be NASA's first crewed mission under the Artemis program. The Artemis II crew, consisting of NASA astronauts Reid Wiseman, Victor Glover, and Christina Koch, and Canadian Space Agency astronaut Jeremy Hansen, will participate in recovery testing at sea next year. For more information about the Artemis program, you can visit the NASA website.",
  "RelevantSources": [
    {
      "Link": "...",
      "SourceContentType": "application/pdf",
      "SourceName": "file5-NASA-news.pdf",
      "Partitions": [
        {
          "Text": "Skip to main content\nJul 28, 2023\nMEDIA ADVISORY M23-095\nNASA Invites Media to See Recovery Craft for\nArtemis Moon Mission\n(/sites/default/files/thumbnails/image/ksc-20230725-ph-fmx01_0003orig.jpg)\nAboard the USS John P. Murtha, NASA and Department of Defense personnel practice recovery operations for Artemis II in July. A\ncrew module test article is used to help verify the recovery team will be ready to recovery the Artemis II crew and the Orion spacecraft.\nCredits: NASA/Frank Michaux\nMedia are invited to see the new test version of NASA’s Orion spacecraft and the hardware teams will use\nto recover the capsule and astronauts upon their return from space during the Artemis II\n(http://www.nasa.gov/artemis-ii) mission. The event will take place at 11 a.m. PDT on Wednesday, Aug. 2,\nat Naval Base San Diego.\nPersonnel involved in recovery operations from NASA, the U.S. Navy, and the U.S. Air Force will be\navailable to speak with media.\nU.S. media interested in attending must RSVP by 4 p.m., Monday, July 31, to the Naval Base San Diego\nPublic Affairs (mailto:nbsd.pao@us.navy.mil) or 619-556-7359.\nOrion Spacecraft (/exploration/systems/orion/index.html)\nNASA Invites Media to See Recovery Craft for Artemis Moon Miss... https://www.nasa.gov/press-release/nasa-invites-media-to-see-recov...\n1 of 3 7/28/23, 4:51 PMTeams are currently conducting the first in a series of tests in the Pacific Ocean to demonstrate and\nevaluate the processes, procedures, and hardware for recovery operations (https://www.nasa.gov\n/exploration/systems/ground/index.html) for crewed Artemis missions. 
The tests will help prepare the\nteam for Artemis II, NASA’s first crewed mission under Artemis that will send four astronauts in Orion\naround the Moon to checkout systems ahead of future lunar missions.\nThe Artemis II crew – NASA astronauts Reid Wiseman, Victor Glover, and Christina Koch, and CSA\n(Canadian Space Agency) astronaut Jeremy Hansen – will participate in recovery testing at sea next year.\nFor more information about Artemis, visit:\nhttps://www.nasa.gov/artemis (https://www.nasa.gov/artemis)\n-end-\nRachel Kraft\nHeadquarters, Washington\n202-358-1100\nrachel.h.kraft@nasa.gov (mailto:rachel.h.kraft@nasa.gov)\nMadison Tuttle\nKennedy Space Center, Florida\n321-298-5868\nmadison.e.tuttle@nasa.gov (mailto:madison.e.tuttle@nasa.gov)\nLast Updated: Jul 28, 2023\nEditor: Claire O’Shea\nTags:  Artemis (/artemisprogram),Ground Systems (http://www.nasa.gov/exploration/systems/ground\n/index.html),Kennedy Space Center (/centers/kennedy/home/index.html),Moon to Mars (/topics/moon-to-\nmars/),Orion Spacecraft (/exploration/systems/orion/index.html)\nNASA Invites Media to See Recovery Craft for Artemis Moon Miss... https://www.nasa.gov/press-release/nasa-invites-media-to-see-recov...\n2 of 3 7/28/23, 4:51 PM",
          "Relevance": 0.8430657,
          "SizeInTokens": 863,
          "LastUpdate": "2023-08-01T08:15:02-07:00"
        }
      ]
    }
  ]
}

You can find a full example here.

Custom memory ingestion pipelines

On the other hand, if you need a custom data pipeline, you can customize the steps, which will then be handled by your own business logic:

// Memory setup, e.g. how to calculate and where to store embeddings
var memoryBuilder = new KernelMemoryBuilder().WithOpenAIDefaults(Env.Var("OPENAI_API_KEY"));
memoryBuilder.Build();
var orchestrator = memoryBuilder.GetOrchestrator();

// Define custom .NET handlers
var step1 = new MyHandler1("step1", orchestrator);
var step2 = new MyHandler2("step2", orchestrator);
var step3 = new MyHandler3("step3", orchestrator);
await orchestrator.AddHandlerAsync(step1);
await orchestrator.AddHandlerAsync(step2);
await orchestrator.AddHandlerAsync(step3);

// Instantiate a custom pipeline
var pipeline = orchestrator
    .PrepareNewFileUploadPipeline("user-id-1", "mytest", new[] { "memory-collection" })
    .AddUploadFile("file1", "file1.docx", "file1.docx")
    .AddUploadFile("file2", "file2.pdf", "file2.pdf")
    .Then("step1")
    .Then("step2")
    .Then("step3")
    .Build();

// Execute in process, process all files with all the handlers
await orchestrator.RunPipelineAsync(pipeline);
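The orchestration idea above, a named sequence of handlers, each processing every uploaded file in turn, can be sketched in a few lines of Python. This is illustrative only: the class and method names here are invented stand-ins, not the KM API, and the two toy handlers do trivial work where real handlers would extract text, chunk, embed, and so on.

```python
class Handler:
    """A named pipeline step; subclasses override process()."""

    def __init__(self, step_name: str):
        self.step_name = step_name

    def process(self, file: dict) -> dict:
        raise NotImplementedError


class ExtractText(Handler):
    def process(self, file):
        file["text"] = file["raw"].upper()  # stand-in for real text extraction
        return file


class CountWords(Handler):
    def process(self, file):
        file["words"] = len(file["text"].split())
        return file


class Orchestrator:
    def __init__(self):
        self.handlers: dict[str, Handler] = {}

    def add_handler(self, handler: Handler) -> None:
        self.handlers[handler.step_name] = handler

    def run_pipeline(self, files: list[dict], steps: list[str]) -> list[dict]:
        # Run every file through each named step, in order.
        for step in steps:
            files = [self.handlers[step].process(f) for f in files]
        return files


orchestrator = Orchestrator()
orchestrator.add_handler(ExtractText("extract"))
orchestrator.add_handler(CountWords("count"))
result = orchestrator.run_pipeline([{"raw": "hello pipeline world"}], ["extract", "count"])
```

Because steps are looked up by name at run time, handlers can live in different processes, or be written in different languages behind a queue, which is what the Azure Queues and RabbitMQ orchestration packages enable.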

Web API specs

The API schema is available at http://127.0.0.1:9001/swagger/index.html when running the service locally with OpenAPI enabled.

Examples and Tools

Examples

  1. Collection of Jupyter notebooks with various scenarios
  2. Using Kernel Memory web service to upload documents and answer questions
  3. Using KM Plugin for Semantic Kernel
  4. Importing files and asking questions without running the service (serverless mode)
  5. Processing files with custom steps
  6. Upload files and ask questions from command line using curl
  7. Customizing RAG and summarization prompts
  8. Custom partitioning/text chunking options
  9. Using a custom embedding/vector generator
  10. Using custom LLMs
  11. Using LLama
  12. Summarizing documents
  13. Natural language to SQL examples
  14. Writing and using a custom ingestion handler
  15. Running a single asynchronous pipeline handler as a standalone service
  16. Test project linked to KM package from nuget.org
  17. Integrating Memory with ASP.NET applications and controllers
  18. Sample code showing how to extract text from files

Tools

  1. Curl script to upload files
  2. Curl script to ask questions
  3. Curl script to search documents
  4. Script to start Qdrant for development tasks
  5. Script to start RabbitMQ for development tasks
  6. .NET appsettings.json generator

Repository Guidance

This repository is a public resource designed to showcase best practices and efficient architecture for specific programming scenarios. Although bug fixes and secure, scalable enhancements are part of our focus, we recommend rigorous review and careful consideration of the code before production use. As with most AI development, the project will evolve rapidly. We warmly welcome you to participate in the development of Kernel Memory: feel free to contribute by opening GitHub issues, sending us PRs, and joining our Discord community. Thank you and happy coding!

Compatible and additional computed target framework versions:
.NET: net6.0 is compatible. net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.
Learn more about Target Frameworks and .NET Standard.

NuGet packages (11)

Showing the top 5 NuGet packages that depend on Microsoft.KernelMemory.Core:

Package Downloads
Microsoft.KernelMemory.AI.LlamaSharp

Provide access to OpenAI LLM models in Kernel Memory to generate text

EachShow.AI

OpenAI, ChatGPT

VeeFriends.ShopifySync.WhatNot

Data as it pertains to VeeFriends.

Alkampfer.KernelMemory.Extensions

Added some extensions for Kernel Memory.

ManagedCode.KernelMemory.Playwright

Playwright for KernelMemory

GitHub repositories (5)

Showing the top 5 popular GitHub repositories that depend on Microsoft.KernelMemory.Core:

Repository Stars
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
microsoft/kernel-memory
RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.
microsoft/teams-ai
SDK focused on building AI based applications and extensions for Microsoft Teams and other Bot Framework channels
lindexi/lindexi_gd
博客用到的代码
ktutak1337/Stellar-Chat
A versatile multi-modal chat application that enables users to develop custom agents, create images, leverage visual recognition, and engage in voice interactions. It integrates seamlessly with local LLMs and commercial models like OpenAI, Gemini, Perplexity, and Claude, and allows to converse with uploaded documents and websites.
Version Downloads Last updated
0.91.241101.1 3,188 11/1/2024
0.91.241031.1 1,044 10/31/2024
0.90.241021.1 7,382 10/22/2024
0.90.241020.3 547 10/20/2024
0.80.241017.2 2,333 10/17/2024
0.79.241014.2 1,296 10/14/2024
0.79.241014.1 161 10/14/2024
0.78.241007.1 3,089 10/8/2024
0.78.241005.1 443 10/6/2024
0.77.241004.1 213 10/5/2024
0.76.240930.3 4,022 9/30/2024
0.75.240924.1 5,101 9/24/2024
0.74.240919.1 2,943 9/19/2024
0.73.240906.1 22,985 9/7/2024
0.72.240904.1 2,633 9/5/2024
0.71.240820.1 10,996 8/21/2024
0.70.240803.1 21,516 8/3/2024
0.69.240727.1 8,410 7/27/2024
0.68.240722.1 3,017 7/22/2024
0.68.240716.1 1,935 7/16/2024
0.67.240712.1 1,673 7/12/2024
0.66.240709.1 5,233 7/9/2024
0.65.240620.1 30,239 6/21/2024
0.64.240619.1 738 6/20/2024
0.63.240618.1 2,567 6/18/2024
0.62.240605.1 20,313 6/5/2024
0.62.240604.1 497 6/4/2024
0.61.240524.1 9,972 5/24/2024
0.61.240519.2 9,962 5/19/2024
0.60.240517.1 230 5/18/2024
0.51.240513.2 6,853 5/13/2024
0.50.240504.7 4,679 5/4/2024
0.40.240501.1 741 5/1/2024
0.39.240427.1 5,543 4/28/2024
0.38.240425.1 1,226 4/25/2024
0.38.240423.1 1,079 4/24/2024
0.37.240420.2 1,580 4/21/2024
0.36.240416.1 14,016 4/16/2024
0.36.240415.2 1,601 4/16/2024
0.36.240415.1 326 4/15/2024
0.35.240412.2 1,234 4/12/2024
0.35.240321.1 24,431 3/21/2024
0.35.240318.1 17,715 3/18/2024
0.34.240313.1 7,485 3/13/2024
0.33.240312.1 543 3/12/2024
0.32.240308.1 1,769 3/8/2024
0.32.240307.3 614 3/7/2024
0.32.240307.2 387 3/7/2024
0.30.240227.1 18,961 2/28/2024
0.29.240219.2 7,409 2/20/2024
0.28.240212.1 3,723 2/13/2024
0.27.240207.1 1,363 2/7/2024
0.27.240205.2 3,085 2/6/2024
0.27.240205.1 241 2/5/2024
0.26.240121.1 16,374 1/22/2024
0.26.240116.2 2,391 1/16/2024
0.26.240115.4 496 1/16/2024
0.26.240104.1 3,747 1/5/2024
0.25.240103.1 271 1/4/2024
0.24.231228.5 1,540 12/29/2023
0.24.231228.4 153 12/29/2023
0.23.231224.1 9,364 12/24/2023
0.23.231221.1 786 12/22/2023
0.23.231219.1 2,645 12/20/2023
0.22.231217.1 182 12/18/2023
0.21.231214.1 265 12/15/2023
0.20.231212.1 888 12/13/2023
0.19.231211.1 1,071 12/11/2023