# LocalAI.NET (In Progress)

A .NET client library for interacting with local LLM providers such as KoboldCpp, Ollama, LM Studio, and Text Generation WebUI. This library provides a simple, efficient, and cross-platform way to use local AI models in your .NET applications.
## Alternative To

- OpenAI.NET SDK (for those who want local/offline alternatives)
- LLamaSharp (simpler API, multiple provider support)
- OllamaSharp (supports multiple providers beyond Ollama)
## Features

- Unified API for multiple local AI providers:
  - KoboldCpp (Native and OpenAI-compatible modes)
  - Ollama
  - LM Studio
  - Text Generation WebUI
- Streaming and non-streaming text completion
- Progress tracking and error handling
- Configurable retry policies and circuit breakers
- Full async/await support
- Comprehensive logging capabilities
- Built-in fault tolerance
## Installation

Install LocalAI.NET via NuGet:

```bash
dotnet add package LocalAI.NET
```
## Quick Start

The snippets below use C# top-level statements; in an existing project, place them inside an async method.

```csharp
using LocalAI.NET.Client;
using LocalAI.NET.Models.Configuration;

// Create a client with the KoboldCpp provider
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:5000",
    ProviderOptions = new KoboldCppNativeOptions
    {
        ContextSize = 2048,
        UseGpu = true,
        RepetitionPenalty = 1.1f
    }
};

using var client = new LocalAIClient(options);

// Generate a text completion
string response = await client.CompleteAsync("Write a short story about a robot:");

// Stream completion tokens
await foreach (var token in client.StreamCompletionAsync("Once upon a time..."))
{
    Console.Write(token);
}

// List available models
var models = await client.GetAvailableModelsAsync();
foreach (var model in models)
{
    Console.WriteLine($"Model: {model.Name} (Provider: {model.Provider})");
    Console.WriteLine($"Context Length: {model.Capabilities.MaxContextLength}");
}
```
## Provider Configuration

### KoboldCpp (Native)

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:5000",
    ProviderOptions = new KoboldCppNativeOptions
    {
        ContextSize = 2048,
        UseGpu = true,
        RepetitionPenalty = 1.1f,
        RepetitionPenaltyRange = 320,
        TrimStop = true,
        Mirostat = new MirostatSettings
        {
            Mode = 2,
            Tau = 5.0f,
            Eta = 0.1f
        }
    }
};
```

Mirostat mode 2 enables adaptive sampling that steers generation toward a target surprise level (`Tau`), with `Eta` acting as the learning rate.
### KoboldCpp (OpenAI-compatible)

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:5000",
    ProviderOptions = new KoboldCppOpenAiOptions
    {
        ContextSize = 2048,
        UseGpu = true,
        ModelName = "koboldcpp",
        UseChatCompletions = true
    }
};
```
### Ollama

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:11434",
    ProviderOptions = new OllamaOptions
    {
        ConcurrentRequests = 1
    }
};
```
### LM Studio

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:1234",
    ProviderOptions = new LMStudioOptions
    {
        UseOpenAIEndpoint = true
    }
};
```
### Text Generation WebUI

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:7860",
    ProviderOptions = new TextGenWebOptions
    {
        UseOpenAIEndpoint = true
    }
};
```
## Completion Options

```csharp
var options = new CompletionOptions
{
    ModelName = "wizardLM",        // Optional model name
    MaxTokens = 200,               // Maximum tokens to generate
    Temperature = 0.7f,            // Randomness (0.0-1.0)
    TopP = 0.9f,                   // Nucleus sampling threshold
    StopSequences = new[] { "\n" } // Sequences that stop generation
};

string response = await client.CompleteAsync("Your prompt here", options);
```
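Completion options compose with the model listing from the Quick Start. A small sketch built only from calls shown earlier (`GetAvailableModelsAsync`, `CompleteAsync`); the prompt and variable names are illustrative:

```csharp
// Pick the first discovered model and target it by name.
var models = await client.GetAvailableModelsAsync();
var firstModel = models.First(); // assumes at least one model is loaded

var summary = await client.CompleteAsync(
    "Summarize the benefits of local inference in two sentences:",
    new CompletionOptions
    {
        ModelName = firstModel.Name, // discovered at runtime
        MaxTokens = 120
    });
Console.WriteLine(summary);
```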
## Progress Tracking

```csharp
client.OnProgress += progress =>
{
    switch (progress.State)
    {
        case LocalAIProgressState.Starting:
            Console.WriteLine("Starting completion...");
            break;
        case LocalAIProgressState.Processing:
            Console.WriteLine($"Processing: {progress.Message}");
            break;
        case LocalAIProgressState.Streaming:
            Console.WriteLine("Receiving tokens...");
            break;
        case LocalAIProgressState.Complete:
            Console.WriteLine("Completion finished!");
            break;
        case LocalAIProgressState.Failed:
            Console.WriteLine($"Error: {progress.Message}");
            break;
    }
};
```
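Progress events pair naturally with streaming. A minimal sketch, using only the members shown above (the prompt is illustrative), that reports elapsed time once a streamed completion finishes:

```csharp
using System.Diagnostics;

var stopwatch = Stopwatch.StartNew();
client.OnProgress += progress =>
{
    // Report total wall-clock time when the stream completes.
    if (progress.State == LocalAIProgressState.Complete)
    {
        Console.WriteLine($"\nFinished in {stopwatch.Elapsed.TotalSeconds:F1}s");
    }
};

await foreach (var token in client.StreamCompletionAsync("Explain recursion in one paragraph:"))
{
    Console.Write(token);
}
```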
## Error Handling

```csharp
try
{
    var response = await client.CompleteAsync("Test prompt");
}
catch (LocalAIException ex)
{
    Console.WriteLine($"LocalAI API error: {ex.Message}");
    if (ex.StatusCode.HasValue)
    {
        Console.WriteLine($"Status code: {ex.StatusCode}");
    }
    if (ex.Provider != null)
    {
        Console.WriteLine($"Provider: {ex.Provider}");
    }
}
catch (Exception ex)
{
    Console.WriteLine($"General error: {ex.Message}");
}
```
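The client already retries internally via `MaxRetryAttempts` (see Advanced Configuration below), but since Polly is a dependency you can also wrap calls in an application-level pipeline of your own. A minimal sketch assuming Polly v8's `ResiliencePipelineBuilder` API; nothing here is part of LocalAI.NET itself:

```csharp
using Polly;
using Polly.Retry;

// Retry LocalAIException failures up to twice, one second apart.
var pipeline = new ResiliencePipelineBuilder<string>()
    .AddRetry(new RetryStrategyOptions<string>
    {
        MaxRetryAttempts = 2,
        Delay = TimeSpan.FromSeconds(1),
        ShouldHandle = new PredicateBuilder<string>().Handle<LocalAIException>()
    })
    .Build();

string response = await pipeline.ExecuteAsync(
    async ct => await client.CompleteAsync("Test prompt"),
    CancellationToken.None);
```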
## Advanced Configuration

```csharp
var options = new LocalAIOptions
{
    BaseUrl = "http://localhost:5000",
    ApiKey = "optional_api_key",
    Timeout = TimeSpan.FromMinutes(2),
    MaxRetryAttempts = 3,
    RetryDelay = TimeSpan.FromSeconds(2),
    Logger = loggerInstance,
    JsonSettings = new JsonSerializerSettings(),
    ProviderOptions = new KoboldCppNativeOptions()
};
```
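`loggerInstance` above is assumed to be a standard `ILogger`, which the package's Microsoft.Extensions.Logging.Abstractions dependency suggests. One way to create one, assuming you also reference the Microsoft.Extensions.Logging and Microsoft.Extensions.Logging.Console packages:

```csharp
using Microsoft.Extensions.Logging;

// Console logger at Debug level; dispose the factory when the app shuts down.
using var loggerFactory = LoggerFactory.Create(builder =>
    builder.AddConsole().SetMinimumLevel(LogLevel.Debug));

ILogger loggerInstance = loggerFactory.CreateLogger("LocalAI.NET");
```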
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Please see CONTRIBUTING.md for details on:

- How to publish to NuGet
- Development guidelines
- Code style
- Testing requirements
- Pull request process

## Support

For issues and feature requests, please use the GitHub issues page.
## Framework Compatibility

| Product | Compatible targets | Additional computed targets |
|---|---|---|
| .NET | net6.0, net7.0, net8.0 | net9.0, plus platform-specific TFMs (android, ios, maccatalyst, macos, tvos, windows, browser) for each compatible version |
## Dependencies

Dependencies are identical across the net6.0, net7.0, and net8.0 targets:

- Microsoft.Extensions.Caching.Memory (>= 9.0.0)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.0)
- Newtonsoft.Json (>= 13.0.3)
- Polly (>= 8.5.0)
- System.Linq.Async (>= 6.0.1)