SpongeEngine.SpongeLLM
0.1.0
.NET CLI: dotnet add package SpongeEngine.SpongeLLM --version 0.1.0
Package Manager: NuGet\Install-Package SpongeEngine.SpongeLLM -Version 0.1.0
PackageReference: <PackageReference Include="SpongeEngine.SpongeLLM" Version="0.1.0" />
Paket: paket add SpongeEngine.SpongeLLM --version 0.1.0
Script & Interactive: #r "nuget: SpongeEngine.SpongeLLM, 0.1.0"
Cake Addin: #addin nuget:?package=SpongeEngine.SpongeLLM&version=0.1.0
Cake Tool: #tool nuget:?package=SpongeEngine.SpongeLLM&version=0.1.0
SpongeLLM
A unified C# client for interacting with various LLM providers through a consistent interface.
Key Features
- Unified Interface: Write code once, switch providers seamlessly
- Multiple Provider Support: Works with KoboldCpp, Ollama, LM Studio, and Text Generation WebUI
- Modern .NET Features: Full async/await support with streaming capabilities
- Cross-Platform: Runs on any platform supporting .NET 6.0, 7.0, or 8.0
- Production Ready: Includes logging, resilience patterns, and comprehensive error handling
Installation
Install via NuGet:
dotnet add package SpongeEngine.SpongeLLM
Quick Start
using SpongeEngine.SpongeLLM;
using SpongeEngine.KoboldSharp;
using SpongeEngine.SpongeLLM.Core.Models;
// Initialize with your preferred provider
var options = new KoboldSharpClientOptions
{
BaseUrl = "http://localhost:5000"
};
var client = new SpongeLLMClient(options);
// Check if service is available
bool isAvailable = await client.IsAvailableAsync();
// Basic text completion
var request = new TextCompletionRequest
{
Prompt = "Write a short story about a robot:",
MaxTokens = 100
};
var result = await client.CompleteTextAsync(request);
Console.WriteLine(result.Text);
// Stream completion tokens
await foreach (var token in client.CompleteTextStreamAsync(request))
{
Console.Write(token.Text);
}
Supported Providers
The library includes built-in support for:
- KoboldCpp: Local inference with various open-source models
- Ollama: Easy deployment of Llama 2, Code Llama, and other models
- LM Studio: User-friendly interface for running local models
- Text Generation WebUI: Popular interface for model deployment and inference
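Because every provider is consumed through the same unified client, switching providers only means constructing a different options object. A minimal sketch, assuming the option types shown in the configuration sections below (exact constructor overloads may differ):

```csharp
using SpongeEngine.SpongeLLM;
using SpongeEngine.KoboldSharp;
using SpongeEngine.LMStudioSharp;

// Choose a provider by swapping the options object; the calling
// code that uses the client stays identical.
var koboldOptions = new KoboldSharpClientOptions { BaseUrl = "http://localhost:5000" };
var lmStudioOptions = new LMStudioClientOptions { BaseUrl = "http://localhost:1234" };

// Either options type yields the same unified client surface.
var client = new SpongeLLMClient(koboldOptions);
// var client = new SpongeLLMClient(lmStudioOptions);
```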
Provider Configuration
KoboldCpp
var options = new KoboldSharpClientOptions
{
BaseUrl = "http://localhost:5000",
UseGpu = true,
MaxContextLength = 2048
};
LM Studio
var options = new LMStudioClientOptions
{
BaseUrl = "http://localhost:1234",
UseOpenAICompat = true
};
Text Generation WebUI (Oobabooga)
var options = new OobaboogaSharpClientOptions
{
BaseUrl = "http://localhost:7860",
UseOpenAICompat = true
};
Completion Options
Configure text completion requests with various parameters:
var request = new TextCompletionRequest
{
Prompt = "Once upon a time",
MaxTokens = 200,
Temperature = 0.7f,
TopP = 0.9f,
StopSequences = new[] { "\n\n", "THE END" }
};
Error Handling
The client includes comprehensive error handling:
try
{
var result = await client.CompleteTextAsync(request);
}
catch (NotSupportedException)
{
Console.WriteLine("This provider doesn't support text completion");
}
catch (Exception ex)
{
Console.WriteLine($"Error during completion: {ex.Message}");
}
Advanced Configuration
SpongeLLM supports additional configuration through provider-specific options:
var options = new KoboldSharpClientOptions
{
BaseUrl = "http://localhost:5000",
Timeout = TimeSpan.FromMinutes(2),
RetryCount = 3,
Logger = loggerInstance,
// Additional provider-specific settings
};
Architecture
SpongeLLM uses a modular architecture based on interfaces:
- ITextCompletion: Basic text completion capabilities
- IStreamableTextCompletion: Streaming completion support
- IIsAvailable: Service availability checking
Each provider implements these interfaces as needed, allowing for consistent interaction regardless of the underlying service.
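For example, calling code can depend only on a capability interface rather than a concrete provider. A hedged sketch; the interface members are inferred from the methods shown in the Quick Start and may not match the library's exact signatures:

```csharp
using SpongeEngine.SpongeLLM.Core.Models;

// Accepts any provider that implements text completion; the concrete
// client type never appears in this method.
async Task<string> SummarizeAsync(ITextCompletion completions, string text)
{
    var request = new TextCompletionRequest
    {
        Prompt = $"Summarize the following text:\n{text}",
        MaxTokens = 150
    };
    var result = await completions.CompleteTextAsync(request);
    return result.Text;
}
```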
Contributing
Contributions are welcome! Please see our Contributing Guidelines for details on:
- Development setup
- Coding standards
- Testing requirements
- Pull request process
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For issues and feature requests, please use the GitHub issues page.
Compatible Frameworks
Product | Compatible and computed target frameworks
---|---
.NET | net6.0, net7.0, and net8.0 are compatible. Platform-specific variants (android, ios, maccatalyst, macos, tvos, windows, browser) and net9.0 were computed.
Dependencies
The dependency set is identical across the net6.0, net7.0, and net8.0 targets:
- Microsoft.Extensions.Caching.Memory (>= 9.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.1)
- Polly (>= 8.5.1)
- SpongeEngine.KoboldSharp (>= 1.82.4.5)
- SpongeEngine.LMStudioSharp (>= 0.3.9.1)
- SpongeEngine.OobaboogaSharp (>= 2.4.0.1)
- SpongeEngine.SpongeLLM.Core (>= 0.1.0)
- System.Linq.Async (>= 6.0.1)
Used By
This package is not used by any NuGet packages or popular GitHub repositories.