Kai.OllamaSharp 1.1.0.9-b

.NET CLI:
dotnet add package Kai.OllamaSharp --version 1.1.0.9-b

Package Manager:
NuGet\Install-Package Kai.OllamaSharp -Version 1.1.0.9-b

PackageReference:
<PackageReference Include="Kai.OllamaSharp" Version="1.1.0.9-b" />

Paket CLI:
paket add Kai.OllamaSharp --version 1.1.0.9-b

Script & Interactive:
#r "nuget: Kai.OllamaSharp, 1.1.0.9-b"

Cake:
// Install Kai.OllamaSharp as a Cake Addin
#addin nuget:?package=Kai.OllamaSharp&version=1.1.0.9-b&prerelease
// Install Kai.OllamaSharp as a Cake Tool
#tool nuget:?package=Kai.OllamaSharp&version=1.1.0.9-b&prerelease
OllamaSharp 🦙
OllamaSharp is a .NET binding for the Ollama API, making it easy to interact with Ollama using your favorite .NET languages.
Features
- Intuitive API client: Set up and interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all Ollama API endpoints including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- API Console: A ready-to-use API console to chat with and manage your Ollama host remotely.
Usage
OllamaSharp wraps every Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few examples to give you a glimpse of how easy it is to use; it is not exhaustive.
Initializing
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
// select a model which should be used for further operations
ollama.SelectedModel = "llama2";
Listing all models that are available locally
var models = await ollama.ListLocalModels();
Pulling a model and reporting progress
await ollama.PullModel("mistral", status => Console.WriteLine($"({status.Percent}%) {status.Status}"));
Streaming a completion directly into the console
// keep reusing the context to keep the chat topic going
ConversationContext context = null;
context = await ollama.StreamCompletion("How are you today?", context, stream => Console.Write(stream.Response));
Building interactive chats
// uses the /chat api from Ollama 0.1.14
// messages including their roles will automatically be tracked within the chat object
var chat = ollama.Chat(stream => Console.WriteLine(stream.Message?.Content ?? ""));
while (true)
{
var message = Console.ReadLine();
await chat.Send(message);
}
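Generating embeddings
The endpoint coverage mentioned above also includes embeddings. The following is a minimal sketch; the exact method and request/response type names (GenerateEmbeddings, GenerateEmbeddingRequest and its Embedding property) may differ between OllamaSharp versions, so treat them as assumptions and check the version you install. It also assumes an Ollama host running locally with the model already pulled.

```csharp
// assumption: GenerateEmbeddings/GenerateEmbeddingRequest exist under these names in this version
var result = await ollama.GenerateEmbeddings(new GenerateEmbeddingRequest
{
    Model = "llama2",
    Prompt = "The quick brown fox jumps over the lazy dog"
});

// result.Embedding holds the numeric vector for the prompt
Console.WriteLine($"Vector length: {result.Embedding.Length}");
```

The resulting vector can then be stored or compared (e.g. by cosine similarity) for semantic search scenarios.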
API Console
This project ships a full-featured demo console covering every endpoint the Ollama API exposes.
It is not only a great resource for learning about OllamaSharp; it can also be used to manage and chat with an Ollama host remotely. Image chat is supported for multi-modal models.
Credits
Icon and name were reused from the amazing Ollama project.
Product | Compatible and additional computed target framework versions
---|---
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos and net8.0-windows were computed.

Dependencies (net8.0)
- Microsoft.AspNetCore.Components.WebAssembly (>= 8.0.6)