Forge.OpenAI 1.5.2

.NET CLI:
    dotnet add package Forge.OpenAI --version 1.5.2

Package Manager (run within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):
    NuGet\Install-Package Forge.OpenAI -Version 1.5.2

PackageReference (for projects that support PackageReference, copy this XML node into the project file to reference the package):
    <PackageReference Include="Forge.OpenAI" Version="1.5.2" />

Paket CLI:
    paket add Forge.OpenAI --version 1.5.2

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy this into the interactive tool or the source code of the script to reference the package):
    #r "nuget: Forge.OpenAI, 1.5.2"

Cake:
    // Install Forge.OpenAI as a Cake Addin
    #addin nuget:?package=Forge.OpenAI&version=1.5.2

    // Install Forge.OpenAI as a Cake Tool
    #tool nuget:?package=Forge.OpenAI&version=1.5.2

Forge.OpenAI is a C# .NET client library for the OpenAI API, supporting GPT-4, GPT-3.5, GPT-3, DALL-E 3, DALL-E 2, Whisper, and more.

It supports both the OpenAI and Azure-OpenAI APIs. The library was developed for public usage and is free to use. Supported .NET versions:

.NET Framework >= 4.6.1

.NET Standard >= 2.0

.NET Core >= 3.1

.NET 6.0

.NET 7.0

.NET 8.0

Works with Blazor WebAssembly and Blazor Server.



To install the package, add the following line to your csproj file, replacing x.x.x with the latest version number:

<PackageReference Include="Forge.OpenAI" Version="x.x.x" />

You can also install via the .NET CLI with the following command:

dotnet add package Forge.OpenAI

If you're using Visual Studio, you can also install via the built-in NuGet package manager.


You need to create an ApiKey to work with the OpenAI API.

If you do not have an account at OpenAI, create one first, then navigate to the API keys page of your account to generate a key.

By default, this library uses Microsoft Dependency Injection; however, it is not required.

You can register the client services with the service collection in your Startup.cs / Program.cs file in your application.

public void ConfigureServices(IServiceCollection services)
{
    services.AddForgeOpenAI(options => {
        options.AuthenticationInfo = Configuration["OpenAI:ApiKey"]!;
    });
}

Or in your Program.cs file.

public static async Task Main(string[] args)
{
    var builder = WebAssemblyHostBuilder.CreateDefault(args);

    builder.Services.AddForgeOpenAI(options => {
        options.AuthenticationInfo = builder.Configuration["OpenAI:ApiKey"]!;
    });

    await builder.Build().RunAsync();
}


public static async Task Main(string[] args)
{
    using var host = Host.CreateDefaultBuilder(args)
        .ConfigureServices((builder, services) =>
            services.AddForgeOpenAI(options => {
                options.AuthenticationInfo = builder.Configuration["OpenAI:ApiKey"]!;
            }))
        .Build();

    await host.RunAsync();
}
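Once registered, the service can be consumed from your own classes. The sketch below shows one plausible way to do this; the ChatCompletionRequest, ChatMessage, and ChatCompletionResponse type names appear in this package's changelog, but the IOpenAIService member names used here (ChatCompletionService, GetAsync, CreateFromUser) and the namespaces are assumptions based on typical usage, not confirmed by this page — check the playground examples in the repository for the authoritative API surface.

```csharp
// Hypothetical sketch: consuming the injected IOpenAIService.
// Member names and namespaces below are assumptions; verify against
// the library's playground examples before use.
using System;
using System.Threading;
using System.Threading.Tasks;
using Forge.OpenAI.Interfaces.Services;   // assumed namespace
using Forge.OpenAI.Models.ChatCompletions; // assumed namespace

public class ChatExample
{
    private readonly IOpenAIService _openAi;

    public ChatExample(IOpenAIService openAi) => _openAi = openAi;

    public async Task AskAsync()
    {
        // ChatCompletionRequest and ChatMessage are named in the changelog;
        // this constructor/factory shape is an assumption.
        var request = new ChatCompletionRequest(ChatMessage.CreateFromUser("Hello!"));

        var response = await _openAi.ChatCompletionService
            .GetAsync(request, CancellationToken.None);

        if (response.IsSuccess)
        {
            Console.WriteLine(response.Result!.Choices[0].Message.Content);
        }
    }
}
```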

You should provide your OpenAI API key, and optionally your organization, to boot up the service. If you do not provide them in the configuration, the service automatically looks up the necessary information in your environment variables, in a JSON file (.openai), or in an environment file (.env).

Example for environment variables:


The ORGANIZATION key is checked for the organization.

Example for Json file:

{ "apikey": "your_api_key", "organization": "organization_id" }

The environment file must contain key/value pairs in the format {key}={value}.

For the key, use one of the same names described under Environment Variables above.

Example for environment file:
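The original example was not preserved on this page; the fragment below is a sketch of the {key}={value} format described above. The ORGANIZATION key is named in this README, while OPENAI_API_KEY is a placeholder assumption for the API key's variable name — use whichever key name the Environment Variables section actually specifies.

```
OPENAI_API_KEY=your_api_key
ORGANIZATION=organization_id
```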




OpenAI and the dependent services require OpenAIOptions, which can be provided manually, or automatically if you use dependency injection. If you need to use multiple OpenAI service instances at the same time, you should provide these options individually, with different settings and authentication credentials.

The options contain many Uri settings, which normally do not need to be changed. The most important option is the AuthenticationInfo property, which contains the ApiKey and the Organization Id.

There is also an additional option, called HttpMessageHandlerFactory, which constructs the HttpMessageHandler for the HttpClient in some special cases, for example if you want to override some behavior of the HttpClient.

There is a built-in logging feature, intended for testing and debugging purposes only, called LogRequestsAndResponses, which persists all requests and responses in a folder (LogRequestsAndResponsesFolder). With this feature you can inspect the low-level messages. I do not recommend using it in a production environment.
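The options described above can be combined in one registration. The property names in this sketch (AuthenticationInfo, LogRequestsAndResponses, LogRequestsAndResponsesFolder, HttpMessageHandlerFactory) all come from this README, but the exact value types they expect are assumptions — in particular, the factory delegate shape shown here is a guess.

```csharp
// Sketch of configuring the documented options; value shapes are assumptions.
services.AddForgeOpenAI(options =>
{
    options.AuthenticationInfo = Configuration["OpenAI:ApiKey"]!;

    // Testing/debugging only: persist raw requests and responses to disk.
    options.LogRequestsAndResponses = true;
    options.LogRequestsAndResponsesFolder = "openai-logs";

    // Optionally override HttpClient behavior with a custom handler
    // (delegate signature assumed).
    options.HttpMessageHandlerFactory = () => new HttpClientHandler();
});
```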


If you would like to learn more about the API capabilities, please visit the official OpenAI API documentation. If you need to generate an API key, please visit the API keys page of your OpenAI account.

I have created a playground, which is part of this solution. It covers all of the features this library provides. Feel free to run through the examples and play with the settings.

The official OpenAI Playground also contains examples of usage.


To set up the service with the Azure-OpenAI provider, you need to specify the name of your Azure OpenAI resource as well as your model deployment id.
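This page only states that the resource name and deployment id must be supplied; it does not preserve the option names. The property names in the sketch below (AzureResourceName, AzureDeploymentId) are placeholders invented for illustration — consult the library's documentation for the real option names.

```csharp
// Hypothetical sketch of Azure-OpenAI setup; the two Azure-specific
// property names below are placeholders, not the library's real API.
services.AddForgeOpenAI(options =>
{
    options.AuthenticationInfo = Configuration["AzureOpenAI:ApiKey"]!;

    // You must provide the Azure OpenAI resource name and the
    // model deployment id (placeholder property names):
    options.AzureResourceName = "my-resource";
    options.AzureDeploymentId = "my-deployment";
});
```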



Product Compatible and additional computed target framework versions.
.NET net5.0 was computed.  net5.0-windows was computed.  net6.0 is compatible.  net6.0-android was computed.  net6.0-ios was computed.  net6.0-maccatalyst was computed.  net6.0-macos was computed.  net6.0-tvos was computed.  net6.0-windows was computed.  net7.0 is compatible.  net7.0-android was computed.  net7.0-ios was computed.  net7.0-maccatalyst was computed.  net7.0-macos was computed.  net7.0-tvos was computed.  net7.0-windows was computed.  net8.0 is compatible.  net8.0-android was computed.  net8.0-browser was computed.  net8.0-ios was computed.  net8.0-maccatalyst was computed.  net8.0-macos was computed.  net8.0-tvos was computed.  net8.0-windows was computed. 
.NET Core netcoreapp2.0 was computed.  netcoreapp2.1 was computed.  netcoreapp2.2 was computed.  netcoreapp3.0 was computed.  netcoreapp3.1 is compatible. 
.NET Standard netstandard2.0 is compatible.  netstandard2.1 was computed. 
.NET Framework net461 is compatible.  net462 was computed.  net463 was computed.  net47 was computed.  net471 was computed.  net472 was computed.  net48 was computed.  net481 was computed. 
MonoAndroid monoandroid was computed. 
MonoMac monomac was computed. 
MonoTouch monotouch was computed. 
Tizen tizen40 was computed.  tizen60 was computed. 
Xamarin.iOS xamarinios was computed. 
Xamarin.Mac xamarinmac was computed. 
Xamarin.TVOS xamarintvos was computed. 
Xamarin.WatchOS xamarinwatchos was computed. 

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
1.5.2 425 6/13/2024
1.4.11 431 5/21/2024
1.4.8 71 5/14/2024
1.4.7 121 5/13/2024
1.4.6 179 5/12/2024
1.4.5 61,797 5/4/2024
1.4.4 19,491 4/27/2024
1.4.3 83 4/26/2024
1.4.2 95 4/26/2024
1.3.7 171 4/21/2024
1.3.6 70,968 3/22/2024
1.3.0 102,575 2/18/2024
1.2.0 28,154 12/10/2023
1.1.7 10,184 12/2/2023
1.1.6 111,301 10/13/2023
1.1.5 277,472 5/17/2023
1.1.4 23,476 5/2/2023
1.1.3 30,771 4/30/2023
1.1.2 15,295 4/16/2023
1.0.3 5,157 3/12/2023
1.0.2 1,064 3/10/2023
1.0.1 51,055 2/19/2023
1.0.0 1,030 2/16/2023

v1.5.2 - Fixing JsonlManager.Load in .NET 4.x, where the null value is not allowed for the StreamReader
v1.5.1 - Fixing JsonlManager.Save in .NET 4.x, where the null value is not allowed for the StreamWriter
v1.5.0 - Batch and run async call support; vector stores, vector store files, and vector store file batches are now supported. ChatCompletion stream options issue fixed, FineTuningJob API changes implemented, FineTuningJob checkpoint support implemented
v1.4.11 - ChatCompletion ChatMessage constructor issue fixed
v1.4.10 - ChatCompletion ChatMessage missing JsonConstructor
v1.4.9 - ChatCompletionRequest changes implemented, MessageContent class added to ChatMessage class
v1.4.8 - GPT-4o model added, following changes in OpenAI API, added missing properties, new models
v1.4.7 - Messages also can be a list of MessageContent, not just a string
v1.4.6 - Thread message content now can be a list of MessageContent, not just a string
v1.4.5 - Fix typo in known model type "Gpr_4_turbo"
v1.4.4 - Improved service factory methods and Playground examples
v1.4.3 - Improved service factory methods
v1.4.2 - Fix issues
v1.4.1 - Fix issue in MessageResponseBase, duplicated status field and wrong "incomplete_details" field. Constants updated in Tool class.
v1.4.0 - New models, properties, bugfixes, supporting v2 of assistant, run, messages, threads
v1.3.8 - Configurable assistant header values, bugfixes
v1.3.7 - Following changes in OpenAI API, added missing properties, new models
v1.3.6 - Added missing properties to RunResponse class
v1.3.5 - Fixed a URL issue in the RunService class
v1.3.4 - Fixed a bug when OpenAIService created manually
v1.3.3 - Following changes in OpenAI API, addition header data included into the requests, fixed
v1.3.2 - OpenAIService class second constructor does not initialize RunService and RunStepService services, fixed
v1.3.1 - ChatTool invalid function data type bug fixed
v1.3.0 - Assistant, threads, messages and run API (beta) support, bugfixes
v1.2.0 - FineTuning Job API support, existing APIs updated to the latest versions
v1.1.7 - .NET 8 support and a fix for the ImageService ImageEditRequest issue
v1.1.6 - Usage is always null bug fixed in ChatCompletionResponse and TextEditResponse
v1.1.5 - Name field for the chat message. PromptLossWeight field of FineTuneCreateRequest is not mandatory (nullable). Azure endpoint default API version changed.
v1.1.4 - Added support for IHttpClientFactory. Now short-lived, long-lived and custom HttpClient instances can be used. Last one is useful for MAUI Android clients.
v1.1.3 - Added optimizations for .NET 7