SpongeEngine.GPT4AllSharp 1.1.1

.NET CLI:
dotnet add package SpongeEngine.GPT4AllSharp --version 1.1.1

Package Manager (Visual Studio Package Manager Console, which uses the NuGet module's version of Install-Package):
NuGet\Install-Package SpongeEngine.GPT4AllSharp -Version 1.1.1

PackageReference (for projects that support it, copy this XML node into the project file):
<PackageReference Include="SpongeEngine.GPT4AllSharp" Version="1.1.1" />

Paket CLI:
paket add SpongeEngine.GPT4AllSharp --version 1.1.1

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks):
#r "nuget: SpongeEngine.GPT4AllSharp, 1.1.1"

Cake:
// Install SpongeEngine.GPT4AllSharp as a Cake Addin
#addin nuget:?package=SpongeEngine.GPT4AllSharp&version=1.1.1

// Install SpongeEngine.GPT4AllSharp as a Cake Tool
#tool nuget:?package=SpongeEngine.GPT4AllSharp&version=1.1.1

GPT4AllSharp


C# client for interacting with GPT4All through its OpenAI-compatible endpoints.

Features

  • Complete support for GPT4All's native API
  • OpenAI-compatible API endpoint support
  • Streaming text generation
  • Comprehensive configuration options
  • Built-in error handling and logging
  • Cross-platform compatibility
  • Full async/await support

📦 View Package on NuGet

Installation

Install via NuGet:

dotnet add package SpongeEngine.GPT4AllSharp

Quick Start

Using Native API

using LocalAI.NET.KoboldCpp.Client;
using LocalAI.NET.KoboldCpp.Models;

// Configure the client
var options = new KoboldCppOptions
{
    BaseUrl = "http://localhost:5001",
    UseGpu = true,
    ContextSize = 2048
};

// Create client instance
using var client = new KoboldCppClient(options);

// Generate completion
var request = new KoboldCppRequest
{
    Prompt = "Write a short story about a robot:",
    MaxLength = 200,
    Temperature = 0.7f,
    TopP = 0.9f
};

var response = await client.GenerateAsync(request);
Console.WriteLine(response.Results[0].Text);

// Stream completion
await foreach (var token in client.GenerateStreamAsync(request))
{
    Console.Write(token);
}

Using OpenAI-Compatible API

var options = new KoboldCppOptions
{
    BaseUrl = "http://localhost:5001",
    UseOpenAiApi = true
};

using var client = new KoboldCppClient(options);

// Simple completion
string response = await client.CompleteAsync(
    "Write a short story about:",
    new CompletionOptions
    {
        MaxTokens = 200,
        Temperature = 0.7f,
        TopP = 0.9f
    });

// Stream completion
await foreach (var token in client.StreamCompletionAsync(
    "Once upon a time...",
    new CompletionOptions { MaxTokens = 200 }))
{
    Console.Write(token);
}

Configuration Options

Basic Options

var options = new KoboldCppOptions
{
    BaseUrl = "http://localhost:5001",    // KoboldCpp server URL
    ApiKey = "optional_api_key",          // Optional API key
    TimeoutSeconds = 600,                 // Request timeout
    ContextSize = 2048,                   // Maximum context size
    UseGpu = true,                        // Enable GPU acceleration
    UseOpenAiApi = false                  // Use OpenAI-compatible API
};

Advanced Generation Parameters

var request = new KoboldCppRequest
{
    Prompt = "Your prompt here",
    MaxLength = 200,                      // Maximum tokens to generate
    MaxContextLength = 2048,              // Maximum context length
    Temperature = 0.7f,                   // Randomness (0.0-1.0)
    TopP = 0.9f,                          // Nucleus sampling threshold
    TopK = 40,                            // Top-K sampling
    TopA = 0.0f,                          // Top-A sampling
    Typical = 1.0f,                       // Typical sampling
    Tfs = 1.0f,                          // Tail-free sampling
    RepetitionPenalty = 1.1f,            // Repetition penalty
    RepetitionPenaltyRange = 64,         // Penalty range
    StopSequences = new List<string> { "\n" },  // Stop sequences
    Stream = false,                       // Enable streaming
    TrimStop = true,                     // Trim stop sequences
    MirostatMode = 0,                    // Mirostat sampling mode
    MirostatTau = 5.0f,                  // Mirostat target entropy
    MirostatEta = 0.1f                   // Mirostat learning rate
};

Error Handling

try
{
    var response = await client.GenerateAsync(request);
}
catch (KoboldCppException ex)
{
    Console.WriteLine($"KoboldCpp error: {ex.Message}");
    Console.WriteLine($"Provider: {ex.Provider}");
    if (ex.StatusCode.HasValue)
    {
        Console.WriteLine($"Status code: {ex.StatusCode}");
    }
    if (ex.ResponseContent != null)
    {
        Console.WriteLine($"Response content: {ex.ResponseContent}");
    }
}
catch (Exception ex)
{
    Console.WriteLine($"General error: {ex.Message}");
}

Logging

The client supports Microsoft.Extensions.Logging:

ILogger logger = LoggerFactory
    .Create(builder => builder
        .AddConsole()
        .SetMinimumLevel(LogLevel.Debug))
    .CreateLogger<KoboldCppClient>();

var client = new KoboldCppClient(options, logger);

JSON Serialization

Custom JSON settings can be provided:

var jsonSettings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore,
    DefaultValueHandling = DefaultValueHandling.Ignore
};

var client = new KoboldCppClient(options, jsonSettings: jsonSettings);

Testing

The library includes both unit and integration tests. Integration tests require a running KoboldCpp server.

To run the tests:

dotnet test

To configure the test environment:

// Set environment variables for testing
Environment.SetEnvironmentVariable("KOBOLDCPP_BASE_URL", "http://localhost:5001");
Environment.SetEnvironmentVariable("KOBOLDCPP_OPENAI_BASE_URL", "http://localhost:5001/v1");

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

For issues and feature requests, please use the GitHub issues page.

Compatible and additional computed target framework versions

.NET: net6.0, net7.0, and net8.0 are compatible. Platform-specific targets (android, ios, maccatalyst, macos, tvos, windows, and net8.0-browser) are computed for each.

Learn more about Target Frameworks and .NET Standard.
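For reference, the PackageReference node shown earlier can be dropped into a minimal project file targeting any of the compatible frameworks; net8.0 is used here purely as an example:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- Any compatible target listed above works; net8.0 is an example. -->
    <TargetFramework>net8.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="SpongeEngine.GPT4AllSharp" Version="1.1.1" />
  </ItemGroup>

</Project>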

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
1.1.1 57 1/4/2025