
Azure OpenAI SDK v2 Migration in C#: All Breaking Changes

By Rajesh Mishra · Mar 21, 2026 · 13 min read
Verified Mar 2026 .NET 9 Azure.AI.OpenAI 2.1.0
In 30 Seconds

Azure.AI.OpenAI v2 restructures the .NET Azure OpenAI SDK as a thin wrapper over the official OpenAI .NET SDK. The primary change is that AzureOpenAIClient replaces OpenAIClient, and all chat types (ChatMessage, ChatCompletion, ChatClient) now come from the OpenAI.Chat namespace. ChatRequestUserMessage becomes UserChatMessage, ChatCompletions becomes ChatCompletion (singular), and streaming now uses CompleteChatStreamingAsync returning AsyncCollectionResult<StreamingChatCompletionUpdate>. Function calling uses ChatTool.CreateFunctionTool instead of FunctionDefinition. The migration is mechanical but comprehensive — nearly every type name changed.

The Azure.AI.OpenAI v2 SDK is not a minor update — it is a complete redesign. If you wrote Azure OpenAI code against v1, every type name related to chat is different. The compiler will tell you about all of them at once. This guide walks through every change systematically, shows you what the v1 code looked like, and gives you the exact v2 replacement for each pattern.

1. What Changed and Why

In v1, Azure.AI.OpenAI was a standalone SDK that duplicated OpenAI’s types. In v2, it became a thin Azure-specific wrapper over the official OpenAI .NET SDK maintained by OpenAI directly. The result is two packages that work together:

What each package provides:

  • Azure.AI.OpenAI 2.x: AzureOpenAIClient, AzureKeyCredential, AzureOpenAIClientOptions
  • OpenAI 2.x: ChatClient, ChatMessage, ChatCompletion, ChatTool, and all core types

Your v1 code imported Azure.AI.OpenAI for everything. Your v2 code imports from both namespaces — Azure.AI.OpenAI for the client, OpenAI.Chat for the types.

This architectural shift also means Azure OpenAI and non-Azure OpenAI .NET code now share the same types. If you have an abstraction layer (an IChatService that could run against either endpoint), the payload types no longer need to differ.
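For example, a service abstraction can now depend only on the shared payload types. The `IChatService` and `AzureChatService` names below are hypothetical, a sketch of the pattern rather than a prescribed design:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using OpenAI.Chat;

// Hypothetical abstraction: any backend can satisfy it, because ChatMessage
// and ChatCompletion come from the shared OpenAI package, not an Azure one.
public interface IChatService
{
    Task<ChatCompletion> CompleteAsync(IEnumerable<ChatMessage> messages);
}

// Azure-backed implementation: only the construction of the ChatClient
// (via AzureOpenAIClient.GetChatClient) is Azure-specific.
public sealed class AzureChatService : IChatService
{
    private readonly ChatClient _chatClient;

    public AzureChatService(ChatClient chatClient) => _chatClient = chatClient;

    public async Task<ChatCompletion> CompleteAsync(IEnumerable<ChatMessage> messages)
        => (await _chatClient.CompleteChatAsync(messages)).Value;
}
```

Because `ChatClient` itself is a shared type, the same implementation works whether the client was created from `AzureOpenAIClient` or from the non-Azure `OpenAIClient` in the base package.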

2. Package Changes

v1 csproj

<PackageReference Include="Azure.AI.OpenAI" Version="1.0.0-beta.17" />

v2 csproj

<PackageReference Include="Azure.AI.OpenAI" Version="2.1.0" />

The OpenAI package (version 2.1.0) is pulled in automatically as a transitive dependency. You do not need to add it explicitly, but you can pin the version if you need a specific build:

<!-- Optional: pin OpenAI base package version -->
<PackageReference Include="OpenAI" Version="2.1.0" />

To upgrade via the CLI:

dotnet remove package Azure.AI.OpenAI
dotnet add package Azure.AI.OpenAI --version 2.1.0

3. Client Initialization

This is the first change that breaks compilation.

v1 — OpenAIClient

using Azure;
using Azure.AI.OpenAI;

// v1: OpenAIClient took endpoint + credential directly
var client = new OpenAIClient(
    new Uri("https://<resource>.openai.azure.com/"),
    new AzureKeyCredential("<api-key>"));

// Chat was called on the client with the deployment name as a parameter
ChatCompletionsOptions options = new()
{
    DeploymentName = "my-gpt4o-deployment",
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("What is dependency injection?")
    }
};

Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);
string reply = response.Value.Choices[0].Message.Content;

v2 — AzureOpenAIClient + ChatClient

using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;
using System.ClientModel; // ClientResult<T>

// v2: AzureOpenAIClient is the Azure entry point
var azureClient = new AzureOpenAIClient(
    new Uri("https://<resource>.openai.azure.com/"),
    new AzureKeyCredential("<api-key>"));

// Get a ChatClient scoped to a specific deployment
ChatClient chatClient = azureClient.GetChatClient("my-gpt4o-deployment");

// Messages are now typed from OpenAI.Chat
List<ChatMessage> messages =
[
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("What is dependency injection?")
];

// CompleteChatAsync returns ClientResult<ChatCompletion>
ClientResult<ChatCompletion> result = await chatClient.CompleteChatAsync(messages);
ChatCompletion completion = result.Value;
string reply = completion.Content[0].Text;

Two structural changes to note:

  1. GetChatClient(deploymentName) replaces embedding the deployment name inside ChatCompletionsOptions. You create one ChatClient per deployment and reuse it.
  2. ClientResult<ChatCompletion> wraps the response. Always access .Value to get the actual ChatCompletion object.
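The one-client-per-deployment model maps naturally onto singleton registration in ASP.NET Core. This is a sketch assuming Microsoft.Extensions.DependencyInjection and configuration keys (`AzureOpenAI:Endpoint`, `AzureOpenAI:ApiKey`, and the deployment name) that your app would define:

```csharp
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

var builder = WebApplication.CreateBuilder(args);

// One AzureOpenAIClient per Azure OpenAI resource; the client is
// thread-safe and intended to be reused for the process lifetime.
builder.Services.AddSingleton(sp =>
{
    var config = sp.GetRequiredService<IConfiguration>();
    return new AzureOpenAIClient(
        new Uri(config["AzureOpenAI:Endpoint"]!),
        new AzureKeyCredential(config["AzureOpenAI:ApiKey"]!));
});

// One ChatClient per deployment, resolved from the shared Azure client.
builder.Services.AddSingleton(sp =>
    sp.GetRequiredService<AzureOpenAIClient>()
      .GetChatClient("my-gpt4o-deployment"));
```

Consumers then take a `ChatClient` dependency directly and never touch the Azure-specific client.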

4. Chat Message Types

Every message type was renamed. The v1 ChatRequest* prefix is gone.

v1 type → v2 type (namespace):

  • ChatRequestSystemMessage → SystemChatMessage (OpenAI.Chat)
  • ChatRequestUserMessage → UserChatMessage (OpenAI.Chat)
  • ChatRequestAssistantMessage → AssistantChatMessage (OpenAI.Chat)

There are also factory methods on the ChatMessage base class if you prefer:

using OpenAI.Chat;

// Constructor style (preferred for clarity)
var system = new SystemChatMessage("You are a helpful assistant.");
var user = new UserChatMessage("Explain LINQ.");
var assistant = new AssistantChatMessage("LINQ stands for Language Integrated Query...");

// Factory method style (equivalent)
ChatMessage systemAlt = ChatMessage.CreateSystemMessage("You are a helpful assistant.");
ChatMessage userAlt = ChatMessage.CreateUserMessage("Explain LINQ.");

Both styles compile and behave identically. The constructor style is more idiomatic in v2 code.

Before and after a multi-turn conversation:

// v1
var history = new List<ChatRequestMessage>
{
    new ChatRequestSystemMessage("You help with C# questions."),
    new ChatRequestUserMessage("What is a record type?"),
    new ChatRequestAssistantMessage("A record is an immutable reference type..."),
    new ChatRequestUserMessage("Can records be mutable?")
};

ChatCompletionsOptions options = new()
{
    DeploymentName = "gpt4o",
    Messages = history
};
Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);

// v2
var history = new List<ChatMessage>
{
    new SystemChatMessage("You help with C# questions."),
    new UserChatMessage("What is a record type?"),
    new AssistantChatMessage("A record is an immutable reference type..."),
    new UserChatMessage("Can records be mutable?")
};

// No DeploymentName on options — it moved to GetChatClient()
ClientResult<ChatCompletion> result = await chatClient.CompleteChatAsync(history);
string reply = result.Value.Content[0].Text;
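To keep a multi-turn conversation going, append the model's reply to the history before the next user turn. A minimal continuation of the v2 snippet above:

```csharp
// Append the assistant's reply so the next request carries full context.
history.Add(new AssistantChatMessage(result.Value.Content[0].Text));
history.Add(new UserChatMessage("Show me an example of a mutable record."));

// The same ChatClient is reused; only the message list grows.
result = await chatClient.CompleteChatAsync(history);
```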

5. Streaming Changes

Streaming is where the API surface changed the most.

v1 — GetChatCompletionsStreamingAsync

using Azure.AI.OpenAI;

// v1 streaming was called on OpenAIClient
StreamingResponse<StreamingChatCompletionsUpdate> streamingResponse =
    await client.GetChatCompletionsStreamingAsync(options);

await foreach (StreamingChatCompletionsUpdate update in streamingResponse)
{
    if (update.ContentUpdate != null)
    {
        Console.Write(update.ContentUpdate);
    }
}

v2 — CompleteChatStreamingAsync

using Azure.AI.OpenAI;
using OpenAI.Chat;
using System.ClientModel; // AsyncCollectionResult<T>

ChatClient chatClient = azureClient.GetChatClient("my-gpt4o-deployment");

List<ChatMessage> messages =
[
    new SystemChatMessage("Be concise."),
    new UserChatMessage("Explain async/await in C#.")
];

// v2: called on ChatClient, not on AzureOpenAIClient
AsyncCollectionResult<StreamingChatCompletionUpdate> streamingResult =
    chatClient.CompleteChatStreamingAsync(messages);

await foreach (StreamingChatCompletionUpdate update in streamingResult)
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}

Key differences:

  • Called on ChatClient, not on AzureOpenAIClient
  • Returns AsyncCollectionResult<StreamingChatCompletionUpdate> (singular Update, not Updates)
  • Content tokens are ChatMessageContentPart objects accessed via .Text — not a raw string on the update
  • No await needed on CompleteChatStreamingAsync() itself — the enumeration is deferred
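If you need the full reply as well as live output (for example, to store it in conversation history), accumulate the parts while streaming. A sketch continuing the snippet above:

```csharp
using System.Text;

var buffer = new StringBuilder();

await foreach (StreamingChatCompletionUpdate update in
    chatClient.CompleteChatStreamingAsync(messages))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);  // live output
        buffer.Append(part.Text);  // accumulate for later use
    }
}

// The complete reply, e.g. for appending to the conversation history.
messages.Add(new AssistantChatMessage(buffer.ToString()));
```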

For a complete streaming endpoint implementation with SSE and conversation history, see the Azure OpenAI Streaming Chat API workshop.

6. Function Calling

Function calling was substantially simplified in v2. FunctionDefinition is replaced by ChatTool.

v1 — FunctionDefinition

using Azure.AI.OpenAI;

var getWeatherFunction = new FunctionDefinition("get_current_weather")
{
    Description = "Get the current weather for a location",
    Parameters = BinaryData.FromObjectAsJson(new
    {
        type = "object",
        properties = new
        {
            location = new { type = "string", description = "City and state" },
            unit = new { type = "string", @enum = new[] { "celsius", "fahrenheit" } }
        },
        required = new[] { "location" }
    })
};

ChatCompletionsOptions options = new()
{
    DeploymentName = "gpt4o",
    Messages = { new ChatRequestUserMessage("What is the weather in Seattle?") },
    Tools = { new ChatCompletionsFunctionToolDefinition(getWeatherFunction) },
    ToolChoice = ChatCompletionsToolChoice.Auto
};

v2 — ChatTool.CreateFunctionTool

using OpenAI.Chat;
using System.ClientModel; // ClientResult<T>

// v2: Create a ChatTool directly — no FunctionDefinition wrapper needed
ChatTool getWeatherTool = ChatTool.CreateFunctionTool(
    functionName: "get_current_weather",
    functionDescription: "Get the current weather for a location",
    functionParameters: BinaryData.FromObjectAsJson(new
    {
        type = "object",
        properties = new
        {
            location = new { type = "string", description = "City and state" },
            unit = new { type = "string", @enum = new[] { "celsius", "fahrenheit" } }
        },
        required = new[] { "location" }
    }));

ChatCompletionOptions options = new()
{
    Tools = { getWeatherTool },
    // v2 uses factory methods instead of the v1 enum-style values
    ToolChoice = ChatToolChoice.CreateAutoChoice()
};

List<ChatMessage> messages = [new UserChatMessage("What is the weather in Seattle?")];
ClientResult<ChatCompletion> result = await chatClient.CompleteChatAsync(messages, options);

To handle the tool call response and return the result:

ChatCompletion completion = result.Value;

if (completion.FinishReason == ChatFinishReason.ToolCalls)
{
    // Add the assistant's tool-call message once, before any tool results
    messages.Add(new AssistantChatMessage(completion));

    // Process each tool call
    foreach (ChatToolCall toolCall in completion.ToolCalls)
    {
        string toolName = toolCall.FunctionName;
        string toolArgs = toolCall.FunctionArguments.ToString();

        // Call your actual function, get the result
        string toolResult = CallWeatherApi(toolArgs);

        // Add the tool result back to the conversation
        messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
    }

    // Continue the conversation
    result = await chatClient.CompleteChatAsync(messages, options);
}
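The `FunctionArguments` payload is JSON, so you can parse it with System.Text.Json before invoking your real function. `CallWeatherApi` above and `ParseLocation` below are hypothetical helpers, shown only to illustrate the shape of the parsing step:

```csharp
using System;
using System.Text.Json;

// Hypothetical helper: extract the "location" argument from tool-call JSON.
static string ParseLocation(string argumentsJson)
{
    using JsonDocument doc = JsonDocument.Parse(argumentsJson);
    return doc.RootElement.GetProperty("location").GetString()
        ?? throw new InvalidOperationException("Missing 'location' argument.");
}

// Usage inside the tool-call loop:
// string location = ParseLocation(toolCall.FunctionArguments.ToString());
```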

7. Authentication Patterns

Authentication uses the same concepts but with the updated client type.

API Key (AzureKeyCredential)

using Azure;
using Azure.AI.OpenAI;

// v1
var v1Client = new OpenAIClient(
    new Uri(endpoint),
    new AzureKeyCredential(apiKey));

// v2 — same credential, different client class
var v2Client = new AzureOpenAIClient(
    new Uri(endpoint),
    new AzureKeyCredential(apiKey));

Managed Identity / DefaultAzureCredential

using Azure.AI.OpenAI;
using Azure.Identity;

// v1
var v1Client = new OpenAIClient(
    new Uri(endpoint),
    new DefaultAzureCredential());

// v2 — same pattern
var v2Client = new AzureOpenAIClient(
    new Uri(endpoint),
    new DefaultAzureCredential());

Configuring client options (retry, API version)

using Azure.AI.OpenAI;
using System.ClientModel.Primitives;

// v2 client options
var options = new AzureOpenAIClientOptions
{
    RetryPolicy = new ClientRetryPolicy(maxRetries: 5)
};

var client = new AzureOpenAIClient(
    new Uri(endpoint),
    new AzureKeyCredential(apiKey),
    options);

For production workloads with sustained traffic, the built-in ClientRetryPolicy handles 429 responses with exponential backoff — but for circuit breaking and advanced resilience, see the guide on fixing Azure OpenAI 429 Too Many Requests in .NET.

For Polly-based resilience pipelines layered on top of the SDK, see Add Resilience to AI Calls in .NET with Polly.

8. Compile Error to Fix Reference Table

When you upgrade the package and rebuild, the compiler will report errors on every v1 type. Use this table to resolve them systematically:

v1 compile error → v2 fix:

  • ChatRequestUserMessage not found → use new UserChatMessage(text) from OpenAI.Chat
  • ChatRequestAssistantMessage not found → use new AssistantChatMessage(text)
  • ChatRequestSystemMessage not found → use new SystemChatMessage(text)
  • ChatCompletionsOptions not found → use ChatCompletionOptions (no 's')
  • ChatCompletions not found → use ChatCompletion (singular)
  • Response<ChatCompletions> not found → use ClientResult<ChatCompletion>
  • OpenAIClient not found → use AzureOpenAIClient for Azure endpoints
  • FunctionDefinition not found → use ChatTool.CreateFunctionTool(name, description, parameters)
  • ChatChoice / .Choices not found → the Choices collection is gone; read completion.Content directly
  • GetChatCompletionsStreamingAsync not found → use chatClient.CompleteChatStreamingAsync() on ChatClient
  • StreamingChatCompletionsUpdate not found → use StreamingChatCompletionUpdate (singular, no 's')
  • update.ContentUpdate no longer a string → it is now IReadOnlyList<ChatMessageContentPart>; use part.Text
  • ChatCompletionsFunctionToolDefinition not found → use ChatTool.CreateFunctionTool(...) directly
  • ChatCompletionsToolChoice not found → use ChatToolChoice.CreateAutoChoice()
  • response.Value.Choices[0].Message.Content no longer compiles → content is IReadOnlyList<ChatMessageContentPart>; use completion.Content[0].Text
  • .Usage.PromptTokens not found → use .Usage.InputTokenCount
  • .Usage.CompletionTokens not found → use .Usage.OutputTokenCount

After resolving all compile errors, verify token usage logging if your application tracks costs:

// v1 token properties
int prompt = response.Value.Usage.PromptTokens;
int completion = response.Value.Usage.CompletionTokens;

// v2 token properties — renamed
ChatCompletion result = clientResult.Value;
int inputTokens  = result.Usage.InputTokenCount;
int outputTokens = result.Usage.OutputTokenCount;
int totalTokens  = result.Usage.TotalTokenCount;
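With the renamed properties, a cost estimate is simple arithmetic. The per-million-token prices below are placeholders, not real rates; substitute your model's actual pricing:

```csharp
// Estimate request cost from v2 token counts.
static decimal EstimateCost(int inputTokens, int outputTokens)
{
    const decimal inputPricePerMillion  = 2.50m;  // USD per 1M input tokens (placeholder)
    const decimal outputPricePerMillion = 10.00m; // USD per 1M output tokens (placeholder)

    return inputTokens * inputPricePerMillion / 1_000_000m
         + outputTokens * outputPricePerMillion / 1_000_000m;
}

// e.g. decimal cost = EstimateCost(result.Usage.InputTokenCount, result.Usage.OutputTokenCount);
```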

Summary

The v1-to-v2 migration in Azure.AI.OpenAI is extensive but mechanical. Every change follows a consistent pattern:

  • OpenAIClient → AzureOpenAIClient; call .GetChatClient(deployment) to get a ChatClient
  • All ChatRequest* message types → *ChatMessage types from OpenAI.Chat
  • ChatCompletions (plural) → ChatCompletion (singular), wrapped in ClientResult<T>
  • Streaming: GetChatCompletionsStreamingAsync on the main client → CompleteChatStreamingAsync on ChatClient
  • Function calling: FunctionDefinition → ChatTool.CreateFunctionTool
  • Options class: ChatCompletionsOptions → ChatCompletionOptions (drop the extra 's')

The biggest benefit of the upgrade is access to the shared OpenAI .NET SDK type surface — future features from OpenAI’s official SDK land in your Azure workloads faster, without waiting for Azure-specific wrappers.

Start with dotnet build after updating the package. Treat each compiler error as a checklist item using the table in section 8. Most migrations complete within one build cycle.

⚠ Production Considerations

  • The ClientResult<T> wrapper is easy to miss — if you call CompleteChatAsync and get a compilation error on accessing .Choices or .Content directly, you need .Value first. Forgetting .Value is the most common runtime mistake after the initial compile-error fixes.
  • Under sustained load, the v2 SDK's built-in retry handles 429s but does not implement circuit breaking. For production workloads, layer Polly resilience on top of the SDK defaults to prevent quota exhaustion cascades.


🧠 Architect’s Note

The v2 package split — Azure.AI.OpenAI as the Azure auth/endpoint layer over the canonical OpenAI .NET SDK — is a deliberate alignment with OpenAI's official SDK surface. This means any non-Azure OpenAI feature added to the official SDK (realtime audio, structured outputs) becomes available in Azure workloads faster. Design your service abstractions against ChatClient rather than AzureOpenAIClient so the auth layer stays swappable.

AI-Friendly Summary


Key Takeaways

  • AzureOpenAIClient (from Azure.AI.OpenAI) replaces OpenAIClient — call .GetChatClient(deployment) to get a ChatClient
  • Chat types are now in OpenAI.Chat: UserChatMessage, AssistantChatMessage, SystemChatMessage replace the ChatRequest* types
  • ChatCompletion (singular) replaces ChatCompletions (plural) — access the response with ClientResult<ChatCompletion>.Value
  • Streaming uses CompleteChatStreamingAsync() returning AsyncCollectionResult<StreamingChatCompletionUpdate>
  • ChatTool.CreateFunctionTool(name, description, BinaryData) replaces FunctionDefinition for function calling
  • ChatCompletionOptions replaces ChatCompletionsOptions — note the dropped 's'

Implementation Checklist

  • Update package reference: Azure.AI.OpenAI to version 2.1.0 (OpenAI 2.1.0 is included transitively)
  • Replace OpenAIClient with AzureOpenAIClient and call .GetChatClient(deploymentName)
  • Replace ChatRequestUserMessage with UserChatMessage, ChatRequestSystemMessage with SystemChatMessage, ChatRequestAssistantMessage with AssistantChatMessage
  • Replace ChatCompletionsOptions with ChatCompletionOptions and update changed property names
  • Update result access: Response<ChatCompletions> becomes ClientResult<ChatCompletion>, access value via .Value
  • Update streaming: GetChatCompletionsStreamingAsync() becomes CompleteChatStreamingAsync() on ChatClient
  • Replace FunctionDefinition with ChatTool.CreateFunctionTool and pass JSON schema as BinaryData
  • Run build and use the compile error table in this article to fix each remaining error

Frequently Asked Questions

What is the biggest architectural change in Azure.AI.OpenAI v2?

In v2, Azure.AI.OpenAI is a thin Azure-specific wrapper over the official OpenAI .NET SDK. The OpenAI package is now a required dependency and provides the core types (ChatMessage, ChatCompletion, ChatClient). Azure.AI.OpenAI adds AzureOpenAIClient, AzureKeyCredential support, and Azure-specific options. This means you import from both namespaces.

Do I need to install both Azure.AI.OpenAI and OpenAI packages?

You only need to add the Azure.AI.OpenAI package explicitly. The OpenAI package is a transitive dependency pulled in automatically. However, you will reference types from both namespaces in your code — Azure.AI.OpenAI for AzureOpenAIClient and AzureKeyCredential, and OpenAI.Chat for ChatClient, ChatMessage subtypes, and ChatCompletion.

What replaced ChatRequestUserMessage in Azure.AI.OpenAI v2?

ChatRequestUserMessage, ChatRequestAssistantMessage, and ChatRequestSystemMessage are all gone. They are replaced by UserChatMessage, AssistantChatMessage, and SystemChatMessage from the OpenAI.Chat namespace. You can also use the static factory methods on ChatMessage: ChatMessage.CreateUserMessage(text) and ChatMessage.CreateSystemMessage(text).

How does streaming work differently in Azure.AI.OpenAI v2?

In v1 you called client.GetChatCompletionsStreamingAsync() on the OpenAIClient. In v2 you call chatClient.CompleteChatStreamingAsync() on a ChatClient obtained via azureClient.GetChatClient(deploymentName). The return type is AsyncCollectionResult<StreamingChatCompletionUpdate>, which you iterate with await foreach.

How does function calling change in Azure.AI.OpenAI v2?

FunctionDefinition is replaced by ChatTool. Use ChatTool.CreateFunctionTool(name, description, parameters) where parameters is a BinaryData containing the JSON schema for your function arguments. Pass tools to ChatCompletionOptions.Tools (was ChatCompletionsOptions.Tools in v1).

Can I still use DefaultAzureCredential for passwordless auth in v2?

Yes. AzureOpenAIClient has an overload that accepts a TokenCredential: new AzureOpenAIClient(new Uri(endpoint), new DefaultAzureCredential()). This is the recommended approach for production workloads using managed identity. The Azure.Identity package version requirement has not changed.

What is ChatCompletionOptions and how is it different from ChatCompletionsOptions?

ChatCompletionOptions (v2) is the renamed equivalent of ChatCompletionsOptions (v1). The key difference besides the name is the property names — MaxTokens is now MaxOutputTokenCount, and the Tools property accepts ChatTool objects instead of ChatCompletionsFunctionToolDefinition. Always check the IntelliSense after migrating because several property names changed.



#Azure OpenAI #SDK v2 #Migration #Breaking Changes #.NET AI