Microsoft.Extensions.AI.AzureAIInference
9.0.1-preview.1.24570.5
Prefix Reserved
.NET 8.0
This package targets .NET 8.0. The package is compatible with this framework or higher.
.NET Standard 2.0
This package targets .NET Standard 2.0. The package is compatible with this framework or higher.
.NET Framework 4.6.2
This package targets .NET Framework 4.6.2. The package is compatible with this framework or higher.
This is a prerelease version of Microsoft.Extensions.AI.AzureAIInference.
dotnet add package Microsoft.Extensions.AI.AzureAIInference --version 9.0.1-preview.1.24570.5
NuGet\Install-Package Microsoft.Extensions.AI.AzureAIInference -Version 9.0.1-preview.1.24570.5
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Microsoft.Extensions.AI.AzureAIInference" Version="9.0.1-preview.1.24570.5" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add Microsoft.Extensions.AI.AzureAIInference --version 9.0.1-preview.1.24570.5
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: Microsoft.Extensions.AI.AzureAIInference, 9.0.1-preview.1.24570.5"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install Microsoft.Extensions.AI.AzureAIInference as a Cake Addin
#addin nuget:?package=Microsoft.Extensions.AI.AzureAIInference&version=9.0.1-preview.1.24570.5&prerelease

// Install Microsoft.Extensions.AI.AzureAIInference as a Cake Tool
#tool nuget:?package=Microsoft.Extensions.AI.AzureAIInference&version=9.0.1-preview.1.24570.5&prerelease
Microsoft.Extensions.AI.AzureAIInference
Provides an implementation of the IChatClient interface for the Azure.AI.Inference package.
Install the package
From the command-line:
dotnet add package Microsoft.Extensions.AI.AzureAIInference
Or directly in the C# project file:
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.AI.AzureAIInference" Version="[CURRENTVERSION]" />
</ItemGroup>
Usage Examples
Chat
using Azure;
using Microsoft.Extensions.AI;
IChatClient client =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
Console.WriteLine(await client.CompleteAsync("What is AI?"));
Chat + Conversation History
using Azure;
using Microsoft.Extensions.AI;
IChatClient client =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
Console.WriteLine(await client.CompleteAsync(
[
new ChatMessage(ChatRole.System, "You are a helpful AI assistant"),
new ChatMessage(ChatRole.User, "What is AI?"),
]));
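The example above sends a fixed message list. To carry a conversation across turns, append the model's reply to the history before the next call. A minimal sketch, assuming the same client setup as above (the follow-up prompt is made up for illustration):

```csharp
using Azure;
using Microsoft.Extensions.AI;

IChatClient client =
    new Azure.AI.Inference.ChatCompletionsClient(
        new("https://models.inference.ai.azure.com"),
        new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
    .AsChatClient("gpt-4o-mini");

List<ChatMessage> history =
[
    new ChatMessage(ChatRole.System, "You are a helpful AI assistant"),
    new ChatMessage(ChatRole.User, "What is AI?"),
];

var response = await client.CompleteAsync(history);
Console.WriteLine(response);

// Keep the assistant's reply in the history so the next turn has context.
history.Add(response.Message);
history.Add(new ChatMessage(ChatRole.User, "Summarize that in one sentence."));
Console.WriteLine(await client.CompleteAsync(history));
```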
Chat streaming
using Azure;
using Microsoft.Extensions.AI;
IChatClient client =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
await foreach (var update in client.CompleteStreamingAsync("What is AI?"))
{
Console.Write(update);
}
Tool calling
using System.ComponentModel;
using Azure;
using Microsoft.Extensions.AI;
IChatClient azureClient =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
IChatClient client = new ChatClientBuilder(azureClient)
.UseFunctionInvocation()
.Build();
ChatOptions chatOptions = new()
{
Tools = [AIFunctionFactory.Create(GetWeather)]
};
await foreach (var message in client.CompleteStreamingAsync("Do I need an umbrella?", chatOptions))
{
Console.Write(message);
}
[Description("Gets the weather")]
static string GetWeather() => Random.Shared.NextDouble() > 0.5 ? "It's sunny" : "It's raining";
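AIFunctionFactory.Create can also build tools from delegates that take parameters; [Description] attributes on the method and its parameters flow into the schema the model sees when deciding how to call the tool. A hedged variation of the example above (the city parameter and the returned strings are invented for illustration; the client setup is the same as before):

```csharp
using System.ComponentModel;
using Microsoft.Extensions.AI;

// The model can now supply an argument when it invokes the tool.
[Description("Gets the weather for a given city")]
static string GetWeather([Description("The city name")] string city) =>
    Random.Shared.NextDouble() > 0.5 ? $"It's sunny in {city}" : $"It's raining in {city}";

ChatOptions chatOptions = new()
{
    Tools = [AIFunctionFactory.Create(GetWeather)]
};
```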
Caching
using Azure;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
IDistributedCache cache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
IChatClient azureClient =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
IChatClient client = new ChatClientBuilder(azureClient)
.UseDistributedCache(cache)
.Build();
for (int i = 0; i < 3; i++)
{
await foreach (var message in client.CompleteStreamingAsync("In less than 100 words, what is AI?"))
{
Console.Write(message);
}
Console.WriteLine();
Console.WriteLine();
}
Telemetry
using Azure;
using Microsoft.Extensions.AI;
using OpenTelemetry.Trace;
// Configure OpenTelemetry exporter
var sourceName = Guid.NewGuid().ToString();
var tracerProvider = OpenTelemetry.Sdk.CreateTracerProviderBuilder()
.AddSource(sourceName)
.AddConsoleExporter()
.Build();
IChatClient azureClient =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
IChatClient client = new ChatClientBuilder(azureClient)
.UseOpenTelemetry(sourceName, c => c.EnableSensitiveData = true)
.Build();
Console.WriteLine(await client.CompleteAsync("What is AI?"));
Telemetry, Caching, and Tool Calling
using System.ComponentModel;
using Azure;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
using OpenTelemetry.Trace;
// Configure telemetry
var sourceName = Guid.NewGuid().ToString();
var tracerProvider = OpenTelemetry.Sdk.CreateTracerProviderBuilder()
.AddSource(sourceName)
.AddConsoleExporter()
.Build();
// Configure caching
IDistributedCache cache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
// Configure tool calling
var chatOptions = new ChatOptions
{
Tools = [AIFunctionFactory.Create(GetPersonAge)]
};
IChatClient azureClient =
new Azure.AI.Inference.ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!))
.AsChatClient("gpt-4o-mini");
IChatClient client = new ChatClientBuilder(azureClient)
.UseDistributedCache(cache)
.UseFunctionInvocation()
.UseOpenTelemetry(sourceName, c => c.EnableSensitiveData = true)
.Build();
for (int i = 0; i < 3; i++)
{
Console.WriteLine(await client.CompleteAsync("How much older is Alice than Bob?", chatOptions));
}
[Description("Gets the age of a person specified by name.")]
static int GetPersonAge(string personName) =>
personName switch
{
"Alice" => 42,
"Bob" => 35,
_ => 26,
};
Dependency Injection
using Azure;
using Azure.AI.Inference;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
// App Setup
var builder = Host.CreateApplicationBuilder();
builder.Services.AddSingleton(
new ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(Environment.GetEnvironmentVariable("GH_TOKEN")!)));
builder.Services.AddDistributedMemoryCache();
builder.Services.AddLogging(b => b.AddConsole().SetMinimumLevel(LogLevel.Trace));
builder.Services.AddChatClient(services => services.GetRequiredService<ChatCompletionsClient>().AsChatClient("gpt-4o-mini"))
.UseDistributedCache()
.UseLogging();
var app = builder.Build();
// Elsewhere in the app
var chatClient = app.Services.GetRequiredService<IChatClient>();
Console.WriteLine(await chatClient.CompleteAsync("What is AI?"));
Minimal Web API
using Azure;
using Azure.AI.Inference;
using Microsoft.Extensions.AI;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddSingleton(new ChatCompletionsClient(
new("https://models.inference.ai.azure.com"),
new AzureKeyCredential(builder.Configuration["GH_TOKEN"]!)));
builder.Services.AddChatClient(services =>
services.GetRequiredService<ChatCompletionsClient>().AsChatClient("gpt-4o-mini"));
var app = builder.Build();
app.MapPost("/chat", async (IChatClient client, string message) =>
{
var response = await client.CompleteAsync(message);
return response.Message;
});
app.Run();
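Once the app is running, the endpoint can be exercised from any HTTP client. A sketch using HttpClient (the localhost port is an assumption — use the one your app reports at startup; the message parameter binds from the query string because it is a simple type):

```csharp
using var http = new HttpClient();

// POST with no body; the prompt travels in the query string.
var response = await http.PostAsync(
    "http://localhost:5000/chat?message=What%20is%20AI%3F", content: null);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```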
Feedback & Contributing
We welcome feedback and contributions in our GitHub repo.
Product | Compatible and additional computed target framework versions
---|---
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 is compatible. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 is compatible. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETFramework 4.6.2
- Azure.AI.Inference (>= 1.0.0-beta.2)
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.1-preview.1.24570.5)
- System.Memory.Data (>= 8.0.1)
- System.Text.Json (>= 8.0.5)
.NETStandard 2.0
- Azure.AI.Inference (>= 1.0.0-beta.2)
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.1-preview.1.24570.5)
- System.Memory.Data (>= 8.0.1)
- System.Text.Json (>= 8.0.5)
net8.0
- Azure.AI.Inference (>= 1.0.0-beta.2)
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.1-preview.1.24570.5)
- System.Memory.Data (>= 8.0.1)
- System.Text.Json (>= 8.0.5)
net9.0
- Azure.AI.Inference (>= 1.0.0-beta.2)
- Microsoft.Bcl.AsyncInterfaces (>= 9.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.1-preview.1.24570.5)
- System.Memory.Data (>= 9.0.0)
- System.Text.Json (>= 9.0.0)
NuGet packages (1)
The following NuGet package depends on Microsoft.Extensions.AI.AzureAIInference:
- Microsoft.SemanticKernel.Connectors.AzureAIInference: Semantic Kernel Model as a Service connectors for Azure AI Studio. Contains clients for chat completion, embeddings, and text-to-image generation.
GitHub repositories (2)
The following popular GitHub repositories depend on Microsoft.Extensions.AI.AzureAIInference:
- microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
- dotnet/ai-samples
Version | Downloads | Last updated |
---|---|---|
9.0.1-preview.1.24570.5 | 950 | 11/21/2024 |
9.0.0-preview.9.24556.5 | 1,016 | 11/12/2024 |
9.0.0-preview.9.24525.1 | 6,014 | 10/26/2024 |
9.0.0-preview.9.24507.7 | 1,812 | 10/8/2024 |