LocalAI.NET.Oobabooga
1.0.2
dotnet add package LocalAI.NET.Oobabooga --version 1.0.2
NuGet\Install-Package LocalAI.NET.Oobabooga -Version 1.0.2
<PackageReference Include="LocalAI.NET.Oobabooga" Version="1.0.2" />
paket add LocalAI.NET.Oobabooga --version 1.0.2
#r "nuget: LocalAI.NET.Oobabooga, 1.0.2"
// Install LocalAI.NET.Oobabooga as a Cake Addin
#addin nuget:?package=LocalAI.NET.Oobabooga&version=1.0.2

// Install LocalAI.NET.Oobabooga as a Cake Tool
#tool nuget:?package=LocalAI.NET.Oobabooga&version=1.0.2
LocalAI.NET.Oobabooga
A .NET client library for interacting with Oobabooga's text-generation-webui through its OpenAI-compatible API endpoints. This library provides a simple, efficient way to use local LLMs in your .NET applications.
This package serves as the Oobabooga integration layer for the LocalAI.NET ecosystem.
Features
- OpenAI-compatible API support
- Text completion and chat completion
- Streaming responses support
- Character templates and instruction formats
- Comprehensive configuration options
- Built-in error handling and logging
- Cross-platform compatibility
- Full async/await support
Installation
Install via NuGet:
dotnet add package LocalAI.NET.Oobabooga
Quick Start
using LocalAI.NET.Oobabooga.Client;
using LocalAI.NET.Oobabooga.Models.Common;
using LocalAI.NET.Oobabooga.Models.Chat;
// Configure the client
var options = new OobaboogaOptions
{
    BaseUrl = "http://localhost:5000", // Default API port for text-generation-webui
    TimeoutSeconds = 120
};
// Create client instance
using var client = new OobaboogaClient(options);
// Simple completion
var response = await client.CompleteAsync(
    "Write a short story about a robot:",
    new OobaboogaCompletionOptions
    {
        MaxTokens = 200,
        Temperature = 0.7f,
        TopP = 0.9f
    });
Console.WriteLine(response);
// Chat completion
var messages = new List<OobaboogaChatMessage>
{
    new() { Role = "user", Content = "Write a poem about coding" }
};
var chatResponse = await client.ChatCompleteAsync(
    messages,
    new OobaboogaChatCompletionOptions
    {
        Mode = "instruct",
        InstructionTemplate = "Alpaca",
        MaxTokens = 200
    });
Console.WriteLine(chatResponse.Choices[0].Message.Content);
// Stream chat completion
await foreach (var message in client.StreamChatCompletionAsync(messages))
{
    Console.Write(message.Content);
}
Configuration Options
Basic Options
var options = new OobaboogaOptions
{
    BaseUrl = "http://localhost:5000", // text-generation-webui server URL
    ApiKey = "optional_api_key",       // Optional API key for authentication
    TimeoutSeconds = 120               // Request timeout in seconds
};
Chat Completion Options
var options = new OobaboogaChatCompletionOptions
{
    ModelName = "optional_model_name", // Specific model to use
    MaxTokens = 200,                   // Maximum tokens to generate
    Temperature = 0.7f,                // Randomness (0.0-1.0)
    TopP = 0.9f,                       // Nucleus sampling threshold
    StopSequences = new[] { "\n" },    // Stop sequences
    Mode = "chat",                     // "chat" or "instruct"
    InstructionTemplate = "Alpaca",    // Template for instruction format
    Character = "Assistant"            // Character template to use
};
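These options are passed straight to ChatCompleteAsync. Below is a minimal sketch of character-based chat mode, built only from the calls shown in the Quick Start; the "Assistant" character is assumed to exist in your text-generation-webui setup:

// Hedged sketch: character-driven chat rather than instruct mode.
var messages = new List<OobaboogaChatMessage>
{
    new() { Role = "user", Content = "Introduce yourself." }
};

var response = await client.ChatCompleteAsync(messages, new OobaboogaChatCompletionOptions
{
    Mode = "chat",
    Character = "Assistant",
    MaxTokens = 200
});

Console.WriteLine(response.Choices[0].Message.Content);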
Text Completion Options
var options = new OobaboogaCompletionOptions
{
    ModelName = "optional_model_name",
    MaxTokens = 200,
    Temperature = 0.7f,
    TopP = 0.9f,
    StopSequences = new[] { "\n" }
};
Error Handling
try
{
    var response = await client.ChatCompleteAsync(messages, options);
}
catch (OobaboogaException ex)
{
    Console.WriteLine($"Oobabooga error: {ex.Message}");
    Console.WriteLine($"Provider: {ex.Provider}");
    Console.WriteLine($"Status code: {ex.StatusCode}");
    Console.WriteLine($"Response content: {ex.ResponseContent}");
}
catch (Exception ex)
{
    Console.WriteLine($"General error: {ex.Message}");
}
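Because the package already depends on Polly, transient failures (for example, the server briefly unavailable while a model loads) can be wrapped in a retry pipeline in your own code. This is only a sketch using Polly 8's ResiliencePipeline, not a built-in feature of the client:

using Polly;
using Polly.Retry;

// Hedged sketch: retry OobaboogaException up to 3 times with exponential backoff.
var retryPipeline = new ResiliencePipelineBuilder()
    .AddRetry(new RetryStrategyOptions
    {
        ShouldHandle = new PredicateBuilder().Handle<OobaboogaException>(),
        MaxRetryAttempts = 3,
        Delay = TimeSpan.FromSeconds(2),
        BackoffType = DelayBackoffType.Exponential
    })
    .Build();

// The cancellation token is ignored here; whether ChatCompleteAsync accepts one is not shown above.
var chatResponse = await retryPipeline.ExecuteAsync(
    async ct => await client.ChatCompleteAsync(messages, options));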
Logging
The client supports Microsoft.Extensions.Logging:
ILogger logger = LoggerFactory
    .Create(builder => builder
        .AddConsole()
        .SetMinimumLevel(LogLevel.Debug))
    .CreateLogger<OobaboogaClient>();

var client = new OobaboogaClient(options, logger);
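If you are already using Microsoft.Extensions.DependencyInjection, the same constructor can be wired up through a factory registration. This is a hedged sketch using only the constructor shown above, not a built-in registration helper:

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

// Hedged sketch: register the client as a singleton and resolve its logger from DI.
// AddConsole() requires the Microsoft.Extensions.Logging.Console package.
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddConsole());
services.AddSingleton(new OobaboogaOptions { BaseUrl = "http://localhost:5000" });
services.AddSingleton(sp => new OobaboogaClient(
    sp.GetRequiredService<OobaboogaOptions>(),
    sp.GetRequiredService<ILogger<OobaboogaClient>>()));

await using var provider = services.BuildServiceProvider();
var oobaboogaClient = provider.GetRequiredService<OobaboogaClient>();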
JSON Serialization
Custom Newtonsoft.Json serializer settings can be provided:
var jsonSettings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore
};

var client = new OobaboogaClient(options, logger, jsonSettings);
Testing
The library includes both unit and integration tests. Integration tests require a running text-generation-webui server.
To run the tests:
dotnet test
Configure test environment:
Environment.SetEnvironmentVariable("OOBABOOGA_BASE_URL", "http://localhost:5000");
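As an illustration, an integration test can read that variable and fall back to the default URL when it is unset. This sketch assumes xUnit, which may not match the repository's actual test setup:

using Xunit;

public class CompletionIntegrationTests
{
    [Fact]
    public async Task CompleteAsync_ReturnsText()
    {
        // Hedged sketch: point the client at the server configured for integration tests.
        var baseUrl = Environment.GetEnvironmentVariable("OOBABOOGA_BASE_URL")
                      ?? "http://localhost:5000";

        using var client = new OobaboogaClient(new OobaboogaOptions { BaseUrl = baseUrl });

        var text = await client.CompleteAsync(
            "Say hello:",
            new OobaboogaCompletionOptions { MaxTokens = 20 });

        Assert.False(string.IsNullOrWhiteSpace(text));
    }
}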
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Support
For issues and feature requests, please use the GitHub issues page.
Product | Compatible and additional computed target framework versions
---|---
.NET | net6.0, net7.0, and net8.0 are compatible; net9.0 was computed. Platform-specific variants (android, ios, maccatalyst, macos, tvos, windows) were computed for each, and browser variants were computed for net8.0 and net9.0.
Dependencies
The dependency set is identical for the net6.0, net7.0, and net8.0 targets:
- Microsoft.Extensions.Caching.Memory (>= 9.0.0)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.0)
- Newtonsoft.Json (>= 13.0.3)
- OllamaSharp (>= 4.0.11)
- Polly (>= 8.5.0)
- System.Linq.Async (>= 6.0.1)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.