How to Use DeepSeek-V3 with .NET 8 (via Microsoft.Extensions.AI)
Everyone is talking about DeepSeek-V3. It’s cheaper than GPT-4, arguably faster, and excellent at coding tasks. But if you look for tutorials, 99% of them are in Python.
As a .NET developer, you don't need to switch languages. Thanks to the new Microsoft.Extensions.AI library (released in preview), connecting to DeepSeek is standardized, clean, and production-ready.
This guide will show you exactly how to wire up DeepSeek-V3 in a .NET 8 Console Application using C#.
Prerequisites: The NuGet Packages
DeepSeek is "OpenAI Compatible," which means we can use the standard OpenAI client but point it to DeepSeek's servers. We will use Microsoft's new unified abstraction layer to keep our code clean.
Run the following commands in your terminal to install the required libraries:
dotnet add package Microsoft.Extensions.AI --prerelease
dotnet add package Microsoft.Extensions.AI.OpenAI --prerelease
dotnet add package Azure.AI.OpenAI --prerelease
The Code: Connecting to DeepSeek
Here is the complete, copy-paste ready C# code. This uses the IChatClient interface, which allows you to easily swap DeepSeek for GPT-4 or Llama later without changing your core logic.
Step 1: Configure the Client
Create a new file called Program.cs and add this logic:
using System.ClientModel;
using Microsoft.Extensions.AI;
using OpenAI;
// 1. Setup Configuration
// Get your key from: https://platform.deepseek.com/
string apiKey = "sk-YOUR-DEEPSEEK-API-KEY";
string endpoint = "https://api.deepseek.com/v1";
string modelId = "deepseek-chat"; // V3 model name
// 2. Create the Source Client (OpenAI SDK)
// We point the endpoint to DeepSeek instead of OpenAI
var openAIClient = new OpenAIClient(
    new ApiKeyCredential(apiKey),
    new OpenAIClientOptions { Endpoint = new Uri(endpoint) });
// 3. Adapt to Microsoft.Extensions.AI
// This creates a standard IChatClient
IChatClient chatClient = openAIClient.AsChatClient(modelId);
// 4. Run the Prompt
Console.WriteLine("Sending request to DeepSeek...");
var response = await chatClient.CompleteAsync("Explain Dependency Injection in .NET 8 in one paragraph.");
Console.WriteLine($"\n[DeepSeek Response]:\n{response.Message.Text}");
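The one-shot call above is fine for single prompts, but most chat apps need a system prompt and conversation history. Here is a hedged sketch of a multi-turn exchange, reusing the chatClient from Step 1 and assuming the preview ChatMessage and ChatRole types from Microsoft.Extensions.AI:

```csharp
// Multi-turn conversation sketch (assumes the preview
// Microsoft.Extensions.AI types: ChatMessage, ChatRole)
List<ChatMessage> history =
[
    new(ChatRole.System, "You are a concise .NET assistant."),
    new(ChatRole.User, "What does IChatClient abstract away?")
];

var turn = await chatClient.CompleteAsync(history);
Console.WriteLine(turn.Message.Text);

// Keep the assistant's reply in the history so the next
// question is answered with context
history.Add(turn.Message);
history.Add(new(ChatRole.User, "Give a one-line example."));
var followUp = await chatClient.CompleteAsync(history);
```

Because IChatClient is provider-agnostic, this history-management pattern works unchanged if you later swap DeepSeek for another backend.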
Handling Streaming (Real-Time Responses)
DeepSeek is fast, but to make your app feel responsive you should stream the response. Here is how to modify the code to print text as it arrives:
Console.WriteLine("Streaming response...");
await foreach (var chunk in chatClient.CompleteStreamingAsync("Write a C# Hello World program."))
{
    Console.Write(chunk.Text);
}
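If you also need the complete text after streaming finishes (for logging or caching), you can accumulate the chunks as they arrive; a small sketch against the same preview API:

```csharp
// Stream to the console while accumulating the full response
var buffer = new System.Text.StringBuilder();
await foreach (var chunk in chatClient.CompleteStreamingAsync(
    "Summarize IChatClient in one sentence."))
{
    Console.Write(chunk.Text);
    buffer.Append(chunk.Text);
}
string fullText = buffer.ToString(); // complete reply once the stream ends
```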
Why Use 'Microsoft.Extensions.AI'?
You might ask: "Why not just use a simple HttpClient?"
Using Microsoft.Extensions.AI provides three massive benefits for Enterprise applications:
- Swappability: You can change the underlying model to Azure OpenAI or Ollama (Local) just by changing the initialization line. The rest of your app code remains the same.
- Pipeline Features: It has built-in support for Logging, Caching, and Telemetry (OpenTelemetry) which are critical for production.
- Standardization: It aligns with the future direction of the .NET ecosystem (Aspire, Semantic Kernel).
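To make the swappability point concrete, here is a hedged sketch of switching the backend to a local model served by Ollama, assuming the preview Microsoft.Extensions.AI.Ollama package and its OllamaChatClient (the model name llama3 is illustrative):

```csharp
using Microsoft.Extensions.AI;

// Before: DeepSeek through the OpenAI-compatible client
// IChatClient chatClient = openAIClient.AsChatClient("deepseek-chat");

// After: a local model via Ollama; only this initialization line changes
IChatClient chatClient = new OllamaChatClient(
    new Uri("http://localhost:11434"), "llama3");

// All downstream code (CompleteAsync, streaming, etc.) stays identical
var response = await chatClient.CompleteAsync("Hello from Ollama!");
Console.WriteLine(response.Message.Text);
```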
Common Errors & Fixes
1. 401 Unauthorized
- Cause: Your DeepSeek API key is invalid or you have no credits.
- Fix: Log in to platform.deepseek.com and check your balance. DeepSeek does not offer a free API tier by default; you need to load credit (e.g. $5) first.
2. 404 Not Found
- Cause: Incorrect Endpoint URL.
- Fix: Ensure your URI is exactly https://api.deepseek.com/v1 or https://api.deepseek.com, depending on the SDK version. Do not append /chat/completions to the base URI; the client adds that path itself.
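When debugging 401 or 404 errors, it can help to bypass the SDK entirely and hit the API with curl. This request shape follows DeepSeek's OpenAI-compatible chat completions endpoint (substitute your real key):

```shell
curl https://api.deepseek.com/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-YOUR-DEEPSEEK-API-KEY" \
  -d '{
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

If curl succeeds but the .NET code fails, the problem is in your client configuration, not your key.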
Conclusion
Integrating DeepSeek-V3 into .NET 8 is surprisingly simple thanks to its OpenAI compatibility. By wrapping it in Microsoft.Extensions.AI, you future-proof your application.
Next Steps: Try building a RAG (Retrieval Augmented Generation) app using this setup combined with a Vector Database like Qdrant.