Azure.AI.Projects.Agents
2.0.0-beta.1
Prefix Reserved
dotnet add package Azure.AI.Projects.Agents --version 2.0.0-beta.1
Azure AI Projects Agents client library for .NET
Develop Agents using the Azure AI Foundry platform, leveraging an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and other LLM providers.
Note: This package is dedicated to performing CRUD operations on Agents and can be used to enable telemetry.
Product documentation | Samples | API reference documentation | Package (NuGet) | SDK source code
Table of contents
- Getting started
- Key concepts
- Additional concepts
- Examples
- Tracing
- Troubleshooting
- Next steps
- Contributing
Getting started
Prerequisites
To use Azure AI Agents capabilities, you must have an Azure subscription, which allows you to create an Azure AI resource and obtain a connection URL.
Install the package
Install the client library for .NET with NuGet:
dotnet add package Azure.AI.Projects.Agents --prerelease
Authenticate the client
To create, update, and delete Agents, use AgentsClient. It is good practice to allow these operations only for users with elevated permissions, for example, administrators.
var projectEndpoint = System.Environment.GetEnvironmentVariable("FOUNDRY_PROJECT_ENDPOINT");
var modelDeploymentName = System.Environment.GetEnvironmentVariable("FOUNDRY_MODEL_NAME");
AgentsClientOptions options = new()
{
Endpoint = new Uri(projectEndpoint)
};
AgentsClient agentsClient = new(tokenProvider: new DefaultAzureCredential(), options: options);
Key concepts
Service API versions
When clients send REST requests to the endpoint, one of the query parameters is api-version, which selects among the API versions supporting different features. The current stable version, v1, is the default.
Select a service API version
The API version may be set by supplying the ApiVersion property on AgentsClientOptions, as shown in the example code below.
AgentsClientOptions options = new()
{
Endpoint = new Uri(projectEndpoint),
ApiVersion = "2025-11-15-preview"
};
AgentsClient agentsClient = new(tokenProvider: new DefaultAzureCredential(), options: options);
Additional concepts
The Azure.AI.Projects.Agents library provides, for each call that issues a REST API request, synchronous and asynchronous counterparts; the latter carry the "Async" suffix. For example, the following code demonstrates the creation of an AgentVersion object.
Synchronous call:
PromptAgentDefinition agentDefinition = new(model: modelDeploymentName)
{
Instructions = "You are a prompt agent."
};
AgentVersion agentVersion1 = agentsClient.CreateAgentVersion(
agentName: "myAgent1",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion1.Id}, name: {agentVersion1.Name}, version: {agentVersion1.Version})");
AgentVersion agentVersion2 = agentsClient.CreateAgentVersion(
agentName: "myAgent2",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion2.Id}, name: {agentVersion2.Name}, version: {agentVersion2.Version})");
Asynchronous call:
PromptAgentDefinition agentDefinition = new(model: modelDeploymentName)
{
Instructions = "You are a prompt agent."
};
AgentVersion agentVersion1 = await agentsClient.CreateAgentVersionAsync(
agentName: "myAgent1",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion1.Id}, name: {agentVersion1.Name}, version: {agentVersion1.Version})");
AgentVersion agentVersion2 = await agentsClient.CreateAgentVersionAsync(
agentName: "myAgent2",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion2.Id}, name: {agentVersion2.Name}, version: {agentVersion2.Version})");
In most of the code snippets we show only the asynchronous sample for brevity. Please refer to the individual samples for both synchronous and asynchronous code.
Examples
Prompt Agents
When creating an Agent, we need to supply an Agent definition to its constructor. To create a declarative prompt Agent, use the PromptAgentDefinition:
PromptAgentDefinition agentDefinition = new(model: modelDeploymentName)
{
Instructions = "You are a prompt agent."
};
AgentVersion agentVersion1 = await agentsClient.CreateAgentVersionAsync(
agentName: "myAgent1",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion1.Id}, name: {agentVersion1.Name}, version: {agentVersion1.Version})");
AgentVersion agentVersion2 = await agentsClient.CreateAgentVersionAsync(
agentName: "myAgent2",
options: new(agentDefinition));
Console.WriteLine($"Agent created (id: {agentVersion2.Id}, name: {agentVersion2.Name}, version: {agentVersion2.Version})");
The code above creates an AgentVersion object, the data object containing the Agent's name and version.
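A created version can later be fetched by name and version. A brief sketch using the GetAgentVersionAsync method that also appears in the Troubleshooting section below; it assumes the version string is exposed as AgentVersion.Version:

```csharp
// Retrieve the version we just created; assumes agentsClient and
// agentVersion1 from the snippet above are in scope.
AgentVersion fetched = await agentsClient.GetAgentVersionAsync(
    agentName: "myAgent1",
    agentVersion: agentVersion1.Version);
Console.WriteLine($"Retrieved agent {fetched.Name}, version {fetched.Version}");
```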
Hosted Agents
Note: This feature is in preview; to use it, please disable the AAIP001 warning:
#pragma warning disable AAIP001
Hosted agents simplify deploying custom agents in a fully controlled environment. To use a hosted agent, we need to provide the Foundry-Features header in our REST requests, which can be done using a PipelinePolicy:
internal class FeaturePolicy(string feature) : PipelinePolicy
{
private const string _FEATURE_HEADER = "Foundry-Features";
public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
{
message.Request.Headers.Add(_FEATURE_HEADER, feature);
ProcessNext(message, pipeline, currentIndex);
}
public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
{
message.Request.Headers.Add(_FEATURE_HEADER, feature);
await ProcessNextAsync(message, pipeline, currentIndex);
}
}
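With the policy defined, it must be attached to the client pipeline. A minimal sketch, assuming AgentsClientOptions derives from System.ClientModel's ClientPipelineOptions (whose AddPolicy method takes a policy and a PipelinePosition); the "HostedAgents" feature value is an illustrative placeholder, so check the service documentation for the real Foundry-Features value:

```csharp
AgentsClientOptions options = new()
{
    Endpoint = new Uri(projectEndpoint)
};
// Assumption: AgentsClientOptions inherits ClientPipelineOptions.AddPolicy.
// "HostedAgents" is a placeholder feature name for illustration only.
options.AddPolicy(new FeaturePolicy("HostedAgents"), PipelinePosition.PerCall);
AgentsClient agentsClient = new(tokenProvider: new DefaultAzureCredential(), options: options);
```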
To create a hosted agent, use the HostedAgentDefinition when creating the AgentVersion object.
private static HostedAgentDefinition GetAgentDefinition(string dockerImage, string modelDeploymentName, string accountId, string applicationInsightConnectionString, string projectEndpoint)
{
HostedAgentDefinition agentDefinition = new(
versions: [new ProtocolVersionRecord(AgentProtocol.ActivityProtocol, "v1")],
cpu: "1",
memory: "2Gi"
)
{
EnvironmentVariables = {
{ "AZURE_OPENAI_ENDPOINT", $"https://{accountId}.cognitiveservices.azure.com/" },
{ "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME", modelDeploymentName },
// Optional variables, used for logging
{ "APPLICATIONINSIGHTS_CONNECTION_STRING", applicationInsightConnectionString },
{ "AGENT_PROJECT_RESOURCE_ID", projectEndpoint },
},
Image = dockerImage,
};
return agentDefinition;
}
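The definition returned by this helper can then be passed to CreateAgentVersionAsync, following the same pattern as the prompt-agent example. The image name and account id below are illustrative placeholders:

```csharp
string appInsightsConnectionString =
    Environment.GetEnvironmentVariable("APPLICATIONINSIGHTS_CONNECTION_STRING");

HostedAgentDefinition hostedDefinition = GetAgentDefinition(
    dockerImage: "myregistry.azurecr.io/my-agent:latest", // placeholder image
    modelDeploymentName: modelDeploymentName,
    accountId: "myaccount",                               // placeholder account
    applicationInsightConnectionString: appInsightsConnectionString,
    projectEndpoint: projectEndpoint);

AgentVersion hostedVersion = await agentsClient.CreateAgentVersionAsync(
    agentName: "myHostedAgent",
    options: new(hostedDefinition));
```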
The created agent needs to be deployed using the Azure CLI:
az login
az cognitiveservices agent start --account-name ACCOUNTNAME --project-name PROJECTNAME --name myHostedAgent --agent-version 1
After the deployment is complete, this Agent can be used for calling responses.
Agent deletion should also be done through the Azure CLI:
az cognitiveservices agent delete-deployment --account-name ACCOUNTNAME --project-name PROJECTNAME --name myHostedAgent --agent-version 1
az cognitiveservices agent delete --account-name ACCOUNTNAME --project-name PROJECTNAME --name myHostedAgent --agent-version 1
Tracing
Note: Tracing functionality is in preliminary preview and is subject to change. Spans, attributes, and events may be modified in future versions.
Environment variable values: All tracing-related environment variables accept true (case-insensitive) or 1 as equivalent enabling values.
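As an illustration (not SDK code), such a value could be interpreted like this:

```csharp
// Accept "true" in any casing, or "1", as enabling values.
static bool IsEnabled(string value) =>
    string.Equals(value, "true", StringComparison.OrdinalIgnoreCase)
    || value == "1";

Console.WriteLine(IsEnabled("TRUE")); // True
Console.WriteLine(IsEnabled("1"));    // True
Console.WriteLine(IsEnabled("yes"));  // False
```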
Enabling GenAI Tracing
Tracing requires enabling GenAI-specific OpenTelemetry support. One way to do this is to set the AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING environment variable value to true. You can also enable the feature with the following code:
AppContext.SetSwitch("Azure.Experimental.EnableGenAITracing", true);
Precedence: If both the AppContext switch and the environment variable are set, the AppContext switch takes priority. No exception is thrown on conflict. If neither is set, the value defaults to false.
Important: When you enable Azure.Experimental.EnableGenAITracing, the SDK automatically enables the Azure.Experimental.EnableActivitySource flag, which is required for the OpenTelemetry instrumentation to function.
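The documented precedence can be pictured with this sketch; it illustrates the rules above and is not the SDK's internal implementation:

```csharp
// AppContext switch, if set, wins; otherwise the environment variable;
// otherwise the default of false.
static bool ResolveGenAITracing()
{
    if (AppContext.TryGetSwitch("Azure.Experimental.EnableGenAITracing", out bool fromSwitch))
        return fromSwitch;

    string env = Environment.GetEnvironmentVariable("AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING");
    return string.Equals(env, "true", StringComparison.OrdinalIgnoreCase) || env == "1";
}
```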
You can add an Application Insights Azure resource to your Microsoft Foundry project. If one is enabled, you can get the Application Insights connection string, configure your AI Projects client, and observe traces in Azure Monitor. Typically, you might want to start tracing before you create a client or an Agent.
Tracing to Azure Monitor
First, set the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable to point to your Azure Monitor resource.
For tracing to Azure Monitor from your application, the preferred option is to use Azure.Monitor.OpenTelemetry.AspNetCore. Install the package with NuGet:
dotnet add package Azure.Monitor.OpenTelemetry.AspNetCore
More information about using the Azure.Monitor.OpenTelemetry.AspNetCore package can be found here.
Another option is to use Azure.Monitor.OpenTelemetry.Exporter package. Install the package with NuGet:
dotnet add package Azure.Monitor.OpenTelemetry.Exporter
Here is an example of how to set up tracing to Azure Monitor using Azure.Monitor.OpenTelemetry.Exporter:
var tracerProvider = Sdk.CreateTracerProviderBuilder()
.AddSource("Azure.AI.Projects.*")
.SetResourceBuilder(ResourceBuilder.CreateDefault().AddService("AgentTracingSample"))
.AddAzureMonitorTraceExporter().Build();
Tracing to Console
For tracing to console from your application, install the OpenTelemetry.Exporter.Console with NuGet:
dotnet add package OpenTelemetry.Exporter.Console
Here is an example of how to set up tracing to the console:
var tracerProvider = Sdk.CreateTracerProviderBuilder()
.AddSource("Azure.AI.Projects.*") // Add the required sources name
.SetResourceBuilder(OpenTelemetry.Resources.ResourceBuilder.CreateDefault().AddService("AgentTracingSample"))
.AddConsoleExporter() // Export traces to the console
.Build();
Enabling content recording
Content recording controls whether message contents and tool call related details, such as parameters and return values, are captured with the traces. This data may include sensitive user information.
To enable content recording, set the OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT environment variable to true. Alternatively, you can control content recording in code; the following line explicitly disables it:
AppContext.SetSwitch("Azure.Experimental.TraceGenAIMessageContent", false);
If neither the environment variable nor the AppContext switch is set, content recording defaults to false.
Precedence: If both the AppContext switch and the environment variable are set, the AppContext switch takes priority. No exception is thrown on conflict.
Troubleshooting
Any operation that fails will throw a ClientResultException. The exception's Status will hold the HTTP response status code. The exception's Message contains a detailed message that may be helpful in diagnosing the issue:
try
{
AgentVersion agent = await agentsClient.GetAgentVersionAsync(
agentName: "agent_which_does_not_exist", agentVersion: "1");
}
catch (ClientResultException e) when (e.Status == 404)
{
Console.WriteLine($"Exception status code: {e.Status}");
Console.WriteLine($"Exception message: {e.Message}");
}
To further diagnose and troubleshoot issues, you can enable logging following the Azure SDK logging documentation. This allows you to capture additional insights into request and response details, which can be particularly helpful when diagnosing complex issues.
Next steps
Beyond the introductory scenarios discussed here, the AI Agents client library supports additional scenarios that take advantage of the full feature set of the AI services. To help explore some of these, the library offers a set of samples illustrating common scenarios. Please see the Samples.
Contributing
See the Azure SDK CONTRIBUTING.md for details on building, testing, and contributing to this library.
| Product | Versions Compatible and additional computed target framework versions. |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 is compatible. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
- .NETStandard 2.0: Azure.Core (>= 1.51.1), OpenAI (>= 2.9.1)
- net10.0: Azure.Core (>= 1.51.1), OpenAI (>= 2.9.1)
- net8.0: Azure.Core (>= 1.51.1), OpenAI (>= 2.9.1)
NuGet packages (1)
Showing the top 1 NuGet packages that depend on Azure.AI.Projects.Agents:
| Package | Downloads |
|---|---|
| Azure.AI.Projects: This is the Azure.AI.Projects client library for developing .NET applications with rich experience. | |
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 2.0.0-beta.1 | 268 | 3/18/2026 |