SpongeEngine.SpongeLLM 0.0.5

A newer version of this package is available; see the version list below.

.NET CLI:
dotnet add package SpongeEngine.SpongeLLM --version 0.0.5

Package Manager (Visual Studio Package Manager Console):
NuGet\Install-Package SpongeEngine.SpongeLLM -Version 0.0.5

PackageReference (copy this XML node into the project file):
<PackageReference Include="SpongeEngine.SpongeLLM" Version="0.0.5" />

Paket:
paket add SpongeEngine.SpongeLLM --version 0.0.5

F# Interactive / Polyglot Notebooks:
#r "nuget: SpongeEngine.SpongeLLM, 0.0.5"

Cake:
// Install SpongeEngine.SpongeLLM as a Cake Addin
#addin nuget:?package=SpongeEngine.SpongeLLM&version=0.0.5

// Install SpongeEngine.SpongeLLM as a Cake Tool
#tool nuget:?package=SpongeEngine.SpongeLLM&version=0.0.5

SpongeLLM


A unified C# client for interacting with various LLM providers through a consistent interface.

Key Features

  • Unified Interface: Write code once, switch providers seamlessly
  • Multiple Provider Support: Works with KoboldCpp, Ollama, LM Studio, and Text Generation WebUI
  • Modern .NET Features: full async/await support with streaming capabilities
  • Cross-Platform: Runs on any platform supporting .NET 6.0, 7.0, or 8.0
  • Production Ready: Includes logging, resilience patterns, and comprehensive error handling

Installation

Install via NuGet:

dotnet add package SpongeEngine.SpongeLLM

Quick Start

using SpongeEngine.SpongeLLM;
using SpongeEngine.KoboldSharp;
using SpongeEngine.SpongeLLM.Core.Models;

// Initialize with your preferred provider
var options = new KoboldSharpClientOptions
{
    BaseUrl = "http://localhost:5000"
};

var client = new SpongeLLMClient(options);

// Check if service is available
bool isAvailable = await client.IsAvailableAsync();

// Basic text completion
var request = new TextCompletionRequest
{
    Prompt = "Write a short story about a robot:",
    MaxTokens = 100
};

var result = await client.CompleteTextAsync(request);
Console.WriteLine(result.Text);

// Stream completion tokens
await foreach (var token in client.CompleteTextStreamAsync(request))
{
    Console.Write(token.Text);
}

Supported Providers

The library includes built-in support for:

  • KoboldCpp: Local inference with various open-source models
  • Ollama: Easy deployment of Llama 2, Code Llama, and other models
  • LM Studio: User-friendly interface for running local models
  • Text Generation WebUI: Popular interface for model deployment and inference

Provider Configuration

KoboldCpp

var options = new KoboldSharpClientOptions
{
    BaseUrl = "http://localhost:5000",
    UseGpu = true,
    MaxContextLength = 2048
};

LM Studio

var options = new LMStudioClientOptions
{
    BaseUrl = "http://localhost:1234",
    UseOpenAICompat = true
};

Text Generation WebUI (Oobabooga)

var options = new OobaboogaSharpClientOptions
{
    BaseUrl = "http://localhost:7860",
    UseOpenAICompat = true
};
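The Supported Providers list above also includes Ollama, which has no configuration example in this section. Assuming it follows the same options pattern as the other providers (the class name `OllamaSharpClientOptions` is inferred from the naming convention above, not confirmed; 11434 is Ollama's default API port), a setup might look like:

```csharp
// Hypothetical options type, inferred from the KoboldSharp/LMStudio naming pattern.
var options = new OllamaSharpClientOptions
{
    BaseUrl = "http://localhost:11434" // Ollama's default listen port
};

var client = new SpongeLLMClient(options);
```

Check the package's provider documentation for the exact options type and property names.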

Completion Options

Configure text completion requests with various parameters:

var request = new TextCompletionRequest
{
    Prompt = "Once upon a time",
    MaxTokens = 200,
    Temperature = 0.7f,
    TopP = 0.9f,
    StopSequences = new[] { "\n\n", "THE END" }
};

Error Handling

The client includes comprehensive error handling:

try
{
    var result = await client.CompleteTextAsync(request);
}
catch (NotSupportedException)
{
    Console.WriteLine("This provider doesn't support text completion.");
}
catch (Exception ex)
{
    Console.WriteLine($"Error during completion: {ex.Message}");
}

Advanced Configuration

SpongeLLM supports additional configuration through provider-specific options:

var options = new KoboldSharpClientOptions
{
    BaseUrl = "http://localhost:5000",
    Timeout = TimeSpan.FromMinutes(2),
    RetryCount = 3,
    Logger = loggerInstance,
    // Additional provider-specific settings
};

Architecture

SpongeLLM uses a modular architecture based on interfaces:

  • ITextCompletion: Basic text completion capabilities
  • IStreamableTextCompletion: Streaming completion support
  • IIsAvailable: Service availability checking

Each provider implements these interfaces as needed, allowing for consistent interaction regardless of the underlying service.
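Because capabilities are expressed as interfaces, calling code can depend on the capability rather than a concrete client. As a sketch (the interface member names here are assumptions mirroring the `SpongeLLMClient` methods shown in Quick Start, not confirmed signatures):

```csharp
// Hypothetical helper: accepts any provider implementing ITextCompletion,
// so the same code works for KoboldCpp, Ollama, LM Studio, etc.
public static async Task<string> CompleteAsync(ITextCompletion provider, string prompt)
{
    var request = new TextCompletionRequest
    {
        Prompt = prompt,
        MaxTokens = 100
    };

    var result = await provider.CompleteTextAsync(request);
    return result.Text;
}
```

With this shape, switching providers only changes which options object constructs the client; the calling code above is unaffected.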

Contributing

Contributions are welcome! Please see our Contributing Guidelines for details on:

  • Development setup
  • Coding standards
  • Testing requirements
  • Pull request process

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For issues and feature requests, please use the GitHub issues page.

Target framework compatibility

  • Compatible: net6.0, net7.0, net8.0
  • Computed: net9.0, plus the platform-specific targets (android, ios, maccatalyst, macos, tvos, windows, and browser where applicable) for each of net6.0 through net9.0

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version  Downloads  Last updated
0.1.0    35         1/31/2025
0.0.6    41         1/31/2025
0.0.5    44         1/30/2025
0.0.3    68         1/26/2025
0.0.1    72         1/26/2025