Betalgo.OpenAI 8.1.0

.NET CLI:
dotnet add package Betalgo.OpenAI --version 8.1.0

Package Manager (run within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):
NuGet\Install-Package Betalgo.OpenAI -Version 8.1.0

PackageReference (for projects that support PackageReference, copy this XML node into the project file):
<PackageReference Include="Betalgo.OpenAI" Version="8.1.0" />

Paket CLI:
paket add Betalgo.OpenAI --version 8.1.0

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy it into the interactive tool or the source of a script):
#r "nuget: Betalgo.OpenAI, 8.1.0"

Cake:
// Install Betalgo.OpenAI as a Cake Addin
#addin nuget:?package=Betalgo.OpenAI&version=8.1.0

// Install Betalgo.OpenAI as a Cake Tool
#tool nuget:?package=Betalgo.OpenAI&version=8.1.0

.NET SDK for OpenAI ChatGPT, Whisper, GPT-4 and DALL·E

Betalgo.OpenAI

Install-Package Betalgo.OpenAI

.NET SDK for OpenAI
Unofficial: OpenAI does not provide an official .NET SDK.

Check out the wiki page:

https://github.com/betalgo/openai/wiki

Check out the new experimental utilities library:

Betalgo.OpenAI.Utilities

Install-Package Betalgo.OpenAI.Utilities

Maintenance of this project is made possible by all the bug reporters, contributors and sponsors.
💖 Sponsors:
@betalgo, Laser Cat Eyes

@tylerje @oferavnery @MayDay-wpf @AnukarOP @Removable

Features

For the changelog, please see the end of the document.

Sample Usages

The repository contains a sample project named OpenAI.Playground that you can refer to for a better understanding of how the library works. However, please exercise caution while experimenting with it, as some of the test methods may result in unintended consequences such as file deletion or fine-tuning.

!! It is highly recommended that you use a separate account instead of your primary account while using the playground. This is because some test methods may add or delete your files and models, which could potentially cause unwanted issues. !!

You can find your API key here: https://platform.openai.com/account/api-keys
You can find your Organization ID here: https://platform.openai.com/account/org-settings

Without using dependency injection:

var openAiService = new OpenAIService(new OpenAiOptions()
{
    ApiKey = Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY")
});

Using dependency injection:

secrets.json:

{
  "OpenAIServiceOptions": {
    //"ApiKey": "Your api key goes here",
    //"Organization": "Your Organization Id goes here (optional)"
  }
}

(How do I use user secrets? Right-click your project name in Solution Explorer, then click "Manage User Secrets". This is a good way to keep your API keys out of source control. You can also set them from the command line with the dotnet user-secrets tool, as sketched below.)
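
For example, from the project directory you can set these values with the dotnet user-secrets tool (a minimal sketch; the key path assumes the OpenAIServiceOptions section shown above):

dotnet user-secrets init
dotnet user-secrets set "OpenAIServiceOptions:ApiKey" "Your api key goes here"
dotnet user-secrets set "OpenAIServiceOptions:Organization" "Your Organization Id goes here"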

Program.cs
serviceCollection.AddOpenAIService();

OR
Use it as shown below, but do NOT put your API key directly in your source code.

Program.cs
serviceCollection.AddOpenAIService(settings => { settings.ApiKey = Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY"); });

After registering the service, you can resolve it from the service provider:

var openAiService = serviceProvider.GetRequiredService<IOpenAIService>();
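
Putting it together, a minimal console-app wiring could look like the sketch below. The using directives and the environment-variable name are assumptions; adjust them to your project.

// Minimal sketch: register the service, build the provider, resolve IOpenAIService.
// The namespaces below are assumptions based on the 8.x package layout.
using Microsoft.Extensions.DependencyInjection;
using OpenAI.Extensions;
using OpenAI.Interfaces;

var serviceCollection = new ServiceCollection();
serviceCollection.AddOpenAIService(settings =>
{
    // Read the key from an environment variable instead of hard-coding it.
    settings.ApiKey = Environment.GetEnvironmentVariable("MY_OPEN_AI_API_KEY");
});

var serviceProvider = serviceCollection.BuildServiceProvider();
var openAiService = serviceProvider.GetRequiredService<IOpenAIService>();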

You can set a default model (optional):

openAiService.SetDefaultModelId(Models.Davinci);

ChatGPT Sample

var completionResult = await openAiService.ChatCompletion.CreateCompletion(new ChatCompletionCreateRequest
{
    Messages = new List<ChatMessage>
    {
        ChatMessage.FromSystem("You are a helpful assistant."),
        ChatMessage.FromUser("Who won the world series in 2020?"),
        ChatMessage.FromAssistant("The Los Angeles Dodgers won the World Series in 2020."),
        ChatMessage.FromUser("Where was it played?")
    },
    Model = Models.ChatGpt3_5Turbo,
    MaxTokens = 50//optional
});
if (completionResult.Successful)
{
   Console.WriteLine(completionResult.Choices.First().Message.Content);
}
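
If you prefer token-by-token output, the chat endpoint also has a streaming variant. A minimal sketch, assuming ChatCompletion.CreateCompletionAsStream follows the same pattern as the Completions Stream Sample later in this document:

// Sketch: stream the chat response chunk by chunk.
var streamedResult = openAiService.ChatCompletion.CreateCompletionAsStream(new ChatCompletionCreateRequest
{
    Messages = new List<ChatMessage>
    {
        ChatMessage.FromUser("Write one sentence about the 2020 World Series.")
    },
    Model = Models.ChatGpt3_5Turbo
});

await foreach (var chunk in streamedResult)
{
    if (chunk.Successful)
    {
        // Each chunk carries a partial piece of the assistant message.
        Console.Write(chunk.Choices.FirstOrDefault()?.Message.Content);
    }
}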

Function Sample

var fn1 = new FunctionDefinitionBuilder("get_current_weather", "Get the current weather")
    .AddParameter("location", PropertyDefinition.DefineString("The city and state, e.g. San Francisco, CA"))
    .AddParameter("format", PropertyDefinition.DefineEnum(new List<string> { "celsius", "fahrenheit" }, "The temperature unit to use. Infer this from the users location."))
    .Validate()
    .Build();

var fn2 = new FunctionDefinitionBuilder("get_n_day_weather_forecast", "Get an N-day weather forecast")
    .AddParameter("location", new() { Type = "string", Description = "The city and state, e.g. San Francisco, CA" })
    .AddParameter("format", PropertyDefinition.DefineEnum(new List<string> { "celsius", "fahrenheit" }, "The temperature unit to use. Infer this from the users location."))
    .AddParameter("num_days", PropertyDefinition.DefineInteger("The number of days to forecast"))
    .Validate()
    .Build();

var fn3 = new FunctionDefinitionBuilder("get_current_datetime", "Get the current date and time, e.g. 'Saturday, June 24, 2023 6:14:14 PM'")
    .Build();

var fn4 = new FunctionDefinitionBuilder("identify_number_sequence", "Get a sequence of numbers present in the user message")
    .AddParameter("values", PropertyDefinition.DefineArray(PropertyDefinition.DefineNumber("Sequence of numbers specified by the user")))
    .Build();

var tools = new List<ToolDefinition>()
{
    new ToolDefinition() { Function = fn1 },
    new ToolDefinition() { Function = fn2 },
    new ToolDefinition() { Function = fn3 },
    new ToolDefinition() { Function = fn4 },
};

ConsoleExtensions.WriteLine("Chat Function Call Test:", ConsoleColor.DarkCyan);
var completionResult = await sdk.ChatCompletion.CreateCompletion(new ChatCompletionCreateRequest
{
    Messages = new List<ChatMessage>
    {
        ChatMessage.FromSystem("Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."),
        ChatMessage.FromUser("Give me a weather report for Chicago, USA, for the next 5 days.")
    },
    Tools = tools,
    MaxTokens = 50,
    Model = Models.Gpt_3_5_Turbo
});

if (completionResult.Successful)
{
    var choice = completionResult.Choices.First();
    Console.WriteLine($"Message:        {choice.Message.Content}");

    var fn = choice.Message.FunctionCall;
    if (fn != null)
    {
        Console.WriteLine($"Function call:  {fn.Name}");
        foreach (var entry in fn.ParseArguments())
        {
            Console.WriteLine($"  {entry.Key}: {entry.Value}");
        }
    }
}
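
What you do with the parsed arguments is up to your application. As a rough sketch, you could route the call to your own code based on the function name; HandleFunctionCall and the weather lookup below are hypothetical application code, not part of the SDK:

// Sketch: route the model's requested function call to local code.
// Call this with choice.Message.FunctionCall from the sample above.
static void HandleFunctionCall(FunctionCall fn)
{
    var args = fn.ParseArguments();
    switch (fn.Name)
    {
        case "get_current_weather":
            var location = args.TryGetValue("location", out var value) ? value?.ToString() : "unknown";
            Console.WriteLine($"Would call a local weather lookup for {location}");
            break;
        default:
            Console.WriteLine($"No local handler registered for {fn.Name}");
            break;
    }
}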

Completions Stream Sample

var completionResult = openAiService.Completions.CreateCompletionAsStream(new CompletionCreateRequest()
{
    Prompt = "Once upon a time",
    MaxTokens = 50
}, Models.Davinci);

await foreach (var completion in completionResult)
{
    if (completion.Successful)
    {
        Console.Write(completion.Choices.FirstOrDefault()?.Text);
    }
    else
    {
        if (completion.Error == null)
        {
            throw new Exception("Unknown Error");
        }

        Console.WriteLine($"{completion.Error.Code}: {completion.Error.Message}");
    }
}
Console.WriteLine("Complete");

DALL·E Sample

var imageResult = await openAiService.Image.CreateImage(new ImageCreateRequest
{
    Prompt = "Laser cat eyes",
    N = 2,
    Size = StaticValues.ImageStatics.Size.Size256,
    ResponseFormat = StaticValues.ImageStatics.ResponseFormat.Url,
    User = "TestUser"
});


if (imageResult.Successful)
{
    Console.WriteLine(string.Join("\n", imageResult.Results.Select(r => r.Url)));
}
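
Since ResponseFormat is set to Url, the response contains image URLs. A small sketch for saving them to disk with HttpClient (the file names are arbitrary):

// Sketch: download each generated image URL and save it locally.
using var httpClient = new HttpClient();
var index = 0;
foreach (var url in imageResult.Results.Select(r => r.Url))
{
    var bytes = await httpClient.GetByteArrayAsync(url);
    await File.WriteAllBytesAsync($"laser-cat-eyes-{index++}.png", bytes);
}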

VISION Sample

var completionResult = await sdk.ChatCompletion.CreateCompletion(
    new ChatCompletionCreateRequest
    {
        Messages = new List<ChatMessage>
        {
            ChatMessage.FromSystem("You are an image analyzer assistant."),
            ChatMessage.FromUser(
                new List<MessageContent>
                {
                    MessageContent.TextContent("What is on the picture in details?"),
                    MessageContent.ImageUrlContent(
                        "https://www.digitaltrends.com/wp-content/uploads/2016/06/1024px-Bill_Cunningham_at_Fashion_Week_photographed_by_Jiyang_Chen.jpg?p=1",
                        ImageStatics.ImageDetailTypes.High
                    )
                }
            ),
        },
        MaxTokens = 300,
        Model = Models.Gpt_4_vision_preview,
        N = 1
    }
);

if (completionResult.Successful)
{
    Console.WriteLine(completionResult.Choices.First().Message.Content);
}

VISION Sample using Base64 encoded image

const string fileName = "image.png";
var binaryImage = await FileExtensions.ReadAllBytesAsync(fileName);

var completionResult = await sdk.ChatCompletion.CreateCompletion(
    new ChatCompletionCreateRequest
    {
        Messages = new List<ChatMessage>
        {
            ChatMessage.FromSystem("You are an image analyzer assistant."),
            ChatMessage.FromUser(
                new List<MessageContent>
                {
                    MessageContent.TextContent("What is on the picture in details?"),
                    MessageContent.ImageBinaryContent(
                        binaryImage,
                        ImageStatics.ImageFileTypes.Png,
                        ImageStatics.ImageDetailTypes.High
                    )
                }
            ),
        },
        MaxTokens = 300,
        Model = Models.Gpt_4_vision_preview,
        N = 1
    }
);

if (completionResult.Successful)
{
    Console.WriteLine(completionResult.Choices.First().Message.Content);
}

Notes:

This library was formerly known as Betalgo.OpenAI.GPT3; it now has a new package ID, Betalgo.OpenAI.

Please note that due to time constraints, I was unable to thoroughly test all of the methods or fully document the library. If you encounter any issues, please do not hesitate to report them or submit a pull request - your contributions are always appreciated.

I initially developed this SDK for my personal use and later decided to share it with the community. As I have not maintained any open-source projects before, any assistance or feedback would be greatly appreciated. If you would like to contribute in any way, please feel free to reach out to me with your suggestions.

I will always be using the latest libraries, and future releases will frequently include breaking changes. Please take this into consideration before deciding to use the library. I want to make it clear that I cannot accept any responsibility for any damage caused by using the library. If you feel that this is not suitable for your purposes, you are free to explore alternative libraries or the OpenAI Web-API.

I am incredibly busy. If I forgot your name, please accept my apologies and let me know so I can add it to the list.

Changelog

8.1.0

  • Added support for Batch API

8.0.1

  • Added support for new Models gpt-4-turbo and gpt-4-turbo-2024-04-09 thanks to @ChaseIngersol

8.0.0

  • Added support for .NET 8.0 thanks to @BroMarduk
  • Utilities library updated to work with only .NET 8.0

7.4.7

  • Fixed a bug that was causing a binary image to be sent as a base64 string, thanks to @yt3trees
  • Fixed a bug that was blocking CreateCompletionAsStream on some platforms. #331
  • Fixed a bug that was causing an error with multiple tool calls, now we are handling index parameter #493, thanks to @David-Buyer

7.4.6

  • Fixed (again 🥲) incorrect model naming for the moderation models and the ada v2 embedding model

7.4.5

  • Fixed function calling streaming bugs thanks to @David-Buyer @dogdie233 @gavi @Maracaipe611
  • Breaking Change: While streaming (CreateCompletionAsStream), there were some unexpected incoming data chunks, such as :ping or :event lines; @gavi discovered this issue. We now ignore these chunks. If you were relying on them, set justDataMode to false.

7.4.4

  • Added support for new models: TextEmbeddingV3Small, TextEmbeddingV3Large, Gpt_3_5_Turbo_0125, Gpt_4_0125_preview, Gpt_4_turbo_preview, Text_moderation_007, Text_moderation_latest, Text_moderation_stable
  • Added optional dimension and encoding parameters for embeddings, thanks to @shanepowell

7.4.3

  • Fixed the response format of AudioCreateSpeechRequest.
  • Updated Azure OpenAI version to 2023-12-01-preview, which now supports dall-e 3.
  • Added the ability to retrieve header values from the base response, such as ratelimit, etc. Please note that this feature is experimental and may change in the future.
  • Semi-Breaking change:
    • The SDK will now attempt to handle 500 errors and other similar errors from the OpenAI server. Previously, an exception was thrown in such cases. Now, the SDK will try to read the response and return it as an error message. This change provides more visibility to developers and helps them understand the cause of the error.

7.4.2

  • Let's start with breaking changes:
    • OpenAI has replaced function calling with tools. We have made the necessary changes to our code. This is not a major change; now you just have a wrapper around your function calling, which is named a "tool". The Playground provides an example; please take a look to see how you can update your code.
      This update was completed by @shanepowell. Many thanks to him.
  • Now we support the Vision API, which involves passing message contents to the existing chat method. It is quite easy to use, but it was not documented in the OpenAI API documentation.
    This feature was completed by @belaszalontai. Many thanks to them.

7.4.1

  • Added support for "Create Speech" thanks to @belaszalontai / @szabe74

7.4.0

  • Added support for Dall-e 3, thanks to @belaszalontai and @szabe74
  • Added support for GPT-4-Turbo/Vision thanks to @ChaseIngersol
  • Models are updated with latest.

7.3.1

  • Reverting a breaking change, which is itself a breaking change (only for 7.3.0):
    • Reverting the usage of EnsureStatusCode() which caused the loss of error information. Initially, I thought it would help in implementing HTTP retry tools, but now I believe it is a bad idea for two reasons.
      1. You can't simply retry if the request wasn't successful because it could fail for various reasons. For example, you might have used too many tokens in your request, causing OpenAI to reject the response, or you might have tried to use a nonexistent model. It would be better to use the Error object in your retry rules. All responses are already derived from this base object.
      2. We will lose error response data.
Compatible and additional computed target framework versions:

.NET: net6.0, net7.0, and net8.0 are compatible. net5.0, net5.0-windows, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.
.NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, and netcoreapp3.1 were computed.
.NET Standard: netstandard2.0 is compatible. netstandard2.1 was computed.
.NET Framework: net461, net462, net463, net47, net471, net472, net48, and net481 were computed.
MonoAndroid: monoandroid was computed.
MonoMac: monomac was computed.
MonoTouch: monotouch was computed.
Tizen: tizen40 and tizen60 were computed.
Xamarin.iOS: xamarinios was computed.
Xamarin.Mac: xamarinmac was computed.
Xamarin.TVOS: xamarintvos was computed.
Xamarin.WatchOS: xamarinwatchos was computed.

NuGet packages (15)

Showing the top 5 NuGet packages that depend on Betalgo.OpenAI:

Soenneker.OpenAI.Client

An async thread-safe singleton for the OpenAI client by Betalgo

Squidex.CLI.Core

Command Line Interface for Squidex Headless CMS

Betalgo.OpenAI.Utilities

Utility tools for Betalgo.OpenAI - Dotnet SDK for OpenAI ChatGPT, Whisper, GPT-4 and DALL·E

OuroborosAI.Core

Powerful layer on top of OpenAI supporting chaining and recursion scenarios. Includes fluent SDK for chaining, templates, and text processing. Still very early and .NET 7 only; docs and broader support to follow.

Oraculum

Library to create and organize factual knowledge inside a Weaviate vector DB and define AI GPT based assistants using this knowledge.

GitHub repositories (4)

Showing the top 4 popular GitHub repositories that depend on Betalgo.OpenAI:

betalgo/openai
OpenAI .NET sdk - Azure OpenAI, ChatGPT, Whisper, and DALL-E
cmu-sei/GHOSTS
GHOSTS is a realistic user simulation framework for cyber simulation, training, and exercise
FireCubeStudios/Clippy
Bring back Clippy on Windows 10/11!
FireCubeStudios/Run
WinUI 3 Run
Version Downloads Last updated
8.2.0-beta 177 4/20/2024
8.1.1 2,970 4/20/2024
8.1.0 3,130 4/16/2024
8.0.1 3,707 4/11/2024
8.0.0 1,997 4/9/2024
7.4.7 2,347 4/6/2024
7.4.6 79,481 2/7/2024
7.4.5 6,063 2/1/2024
7.4.4 1,777 1/30/2024
7.4.3 81,642 12/11/2023
7.4.2 6,763 12/6/2023
7.4.1 39,059 11/15/2023
7.4.0 13,790 11/10/2023
7.3.1 21,677 10/19/2023
7.2.0 24,799 10/6/2023
7.1.5 124,711 8/6/2023
7.1.3 8,546 7/30/2023
7.0.0 154,272 5/31/2023
6.8.6 45,003 5/11/2023
6.8.5 38,259 4/19/2023
6.8.2 919 3/28/2023