
What Is Semantic Kernel? Beginner Guide for .NET Developers

Verified Apr 2026 · Beginner · Original · .NET 10 · Microsoft.SemanticKernel 1.71.0
By Rajesh Mishra · Mar 9, 2026 · 12 min read
In 30 Seconds

Semantic Kernel is Microsoft's open-source AI orchestration SDK for .NET. It provides plugins (reusable AI tools), memory (vector store integration), and planning (multi-step AI workflows) for C# applications. In the 2026 Microsoft AI stack, SK sits above Microsoft.Extensions.AI and below Microsoft Agent Framework. SK is the right choice for most .NET AI integration scenarios; Agent Framework extends it for autonomous multi-agent systems.

What Semantic Kernel Actually Is

Semantic Kernel is Microsoft’s open-source SDK for building AI-powered applications in .NET. It provides the glue between your C# code and AI models — handling the orchestration, tool invocation, and memory management that every AI application needs.

In practical terms, when you want your .NET application to chat with GPT-5 or another current production model, call functions based on user intent, remember context across conversations, or orchestrate multi-step AI workflows, Semantic Kernel is the SDK that makes this happen.

Microsoft built it because .NET developers needed a first-class AI orchestration library that works with their existing tools — Visual Studio, NuGet, dependency injection, async/await. Before SK, integrating AI into .NET meant either using Python libraries through interop or writing raw HTTP calls to OpenAI APIs. Neither approach was acceptable for production C# codebases.

Where Semantic Kernel Fits in 2026

The Microsoft AI stack has a clear hierarchy, and understanding where SK sits prevents confusion about which tool to use when:

Your .NET Application
        ↓
Microsoft Agent Framework (Agents · Orchestration · MCP)
        ↓
Semantic Kernel (Plugins · Memory · Planning)
        ↓
Microsoft.Extensions.AI (IChatClient · IEmbeddingGenerator)
        ↓
Azure OpenAI / Azure AI Search
The Microsoft AI stack — each layer builds on the one below. SK sits between the provider abstraction (Microsoft.Extensions.AI) and the agentic layer (Agent Framework).

Microsoft.Extensions.AI defines the interfaces — IChatClient, IEmbeddingGenerator. Every AI provider implements these. You can swap Azure OpenAI for Ollama in one line because both implement the same interface.
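As a sketch of that one-line swap (constructor and extension names here are illustrative; the exact signatures depend on the connector packages you install, e.g. an Ollama or Azure OpenAI connector for Microsoft.Extensions.AI):

```csharp
using Microsoft.Extensions.AI;

// Both providers implement IChatClient, so only this assignment changes.
// OllamaChatClient, azureClient, and AsIChatClient are illustrative names;
// check your connector packages for the exact APIs.
IChatClient client = useLocalModel
    ? new OllamaChatClient(new Uri("http://localhost:11434"), "llama3")
    : azureClient.GetChatClient("chat-deployment").AsIChatClient();

// Everything downstream is provider-agnostic:
var reply = await client.GetResponseAsync("Summarize IChatClient in one line.");
Console.WriteLine(reply.Text);
```

The point is architectural: because both clients satisfy the same interface, the swap is a registration-time decision rather than a rewrite.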

Semantic Kernel uses those interfaces and adds orchestration on top — plugins that expose your C# methods to the LLM, memory that persists context across conversations, and planning that chains multiple steps together.

Microsoft Agent Framework extends SK for autonomous agents — multi-agent orchestration, checkpointing, human-in-the-loop, MCP integration. If you need agents that operate independently, delegate tasks to other agents, or manage long-running workflows, Agent Framework is the layer to add.

The key insight: SK is not being replaced by Agent Framework. They serve different purposes. Most .NET AI applications — chat features, RAG implementations, function calling, content generation — work perfectly with SK alone. Agent Framework is specifically for agentic patterns. You only need it when you need agents.

The Four Core Concepts

1. The Kernel

The kernel is the central hub. It holds your AI model connections, registered plugins, and configuration. Every AI operation flows through the kernel:

using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

builder.AddAzureOpenAIChatCompletion(
    deploymentName: "chat-deployment",
    endpoint: "https://your-resource.openai.azure.com/",
    apiKey: "your-api-key");

var kernel = builder.Build();

Once built, you can invoke the kernel for chat completions, function calling, or multi-step plans. Think of it as the DI container for your AI capabilities.

2. Plugins

Plugins are how you give the LLM access to your code. Any C# method can become a plugin by adding the [KernelFunction] attribute:

using System.ComponentModel;   // for [Description]
using Microsoft.SemanticKernel;

public class OrderPlugin
{
    private readonly IOrderService _orderService;

    public OrderPlugin(IOrderService orderService)
    {
        _orderService = orderService;
    }

    [KernelFunction("get_order_status")]
    [Description("Get the current status of a customer order by order ID")]
    public async Task<string> GetOrderStatusAsync(int orderId)
    {
        var order = await _orderService.GetAsync(orderId);
        return $"Order {orderId}: {order.Status}, shipped {order.ShipDate:d}";
    }

    [KernelFunction("cancel_order")]
    [Description("Cancel an existing order if it hasn't shipped yet")]
    public async Task<string> CancelOrderAsync(int orderId)
    {
        var result = await _orderService.CancelAsync(orderId);
        return result ? $"Order {orderId} cancelled." : $"Order {orderId} cannot be cancelled — already shipped.";
    }
}

Register the plugin with the kernel, enable automatic function calling, and the LLM will invoke your methods when the user’s intent matches:

// OrderPlugin takes IOrderService in its constructor, so register an
// implementation first (OrderService is your own class):
builder.Services.AddSingleton<IOrderService, OrderService>();
builder.Plugins.AddFromType<OrderPlugin>();
var kernel = builder.Build();

var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var result = await kernel.InvokePromptAsync(
    "What's the status of order 12345?", new(settings));

The LLM sees the function descriptions, determines that get_order_status is relevant, calls it with orderId: 12345, and incorporates the result into its response. Your users get answers grounded in real data, not hallucinated responses.

3. Memory

Memory lets your AI application remember things across conversations. SK supports both short-term (chat history) and long-term (vector store) memory:

Chat History — Keep recent conversation context:

using Microsoft.SemanticKernel.ChatCompletion;

var chatHistory = new ChatHistory();
chatHistory.AddSystemMessage("You are a helpful .NET development assistant.");
chatHistory.AddUserMessage("What's new in .NET 9?");
// ... response
chatHistory.AddUserMessage("What about performance improvements?");
// The AI remembers the previous question
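To actually get each response, pass the history to the chat completion service and append the reply before the next turn. A minimal sketch using SK's IChatCompletionService:

```csharp
using Microsoft.SemanticKernel.ChatCompletion;

// Resolve the chat service from a kernel configured with a chat model.
var chatService = kernel.GetRequiredService<IChatCompletionService>();

var reply = await chatService.GetChatMessageContentAsync(chatHistory);
chatHistory.AddAssistantMessage(reply.Content ?? string.Empty);

// The next GetChatMessageContentAsync call sees the full exchange,
// which is what makes the follow-up question above work.
```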

Vector Store Memory — Query your own documents:

var memoryBuilder = new MemoryBuilder();
memoryBuilder.WithAzureOpenAITextEmbeddingGeneration("text-embedding-3-small", endpoint, apiKey);
memoryBuilder.WithMemoryStore(new AzureAISearchMemoryStore(searchEndpoint, searchKey));
var memory = memoryBuilder.Build();

// Store a document
await memory.SaveInformationAsync("docs", "Semantic Kernel supports .NET 8+", "doc-1");

// Query later: SearchAsync streams matches as an IAsyncEnumerable
await foreach (var result in memory.SearchAsync("docs", "what .NET versions work?"))
{
    Console.WriteLine(result.Metadata.Text);
}

4. Functions and Planning

Functions are the callable units in SK. They come in two types:

  • Native Functions — Your C# code wrapped as plugins (shown above)
  • Prompt Functions — Templated prompts that SK executes against the LLM

// Prompt function defined inline
var summarize = kernel.CreateFunctionFromPrompt(
    "Summarize the following text in 3 bullet points:\n\n{{$input}}");

var summary = await kernel.InvokeAsync(summarize,
    new() { ["input"] = longArticleText });

Planning combines multiple functions into a sequence. SK’s planner analyzes the user’s goal, identifies which functions to call, determines the order, and executes the plan:

var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// The LLM may call multiple plugins in sequence to answer
var result = await kernel.InvokePromptAsync(
    "Find order 12345, check if it's shippable, and generate a shipping label.",
    new(settings));

Semantic Kernel vs Other Options

SK vs LangChain

LangChain is Python-first. If your team writes C# and deploys to Azure, LangChain means adding Python infrastructure, interop complexity, and a different deployment model. Semantic Kernel gives you the same capabilities — plugins, memory, chains — with native .NET support, NuGet distribution, and DI integration.

For a detailed comparison, see Semantic Kernel vs LangChain: The .NET Developer’s Guide.

SK vs Direct API Calls

You can call OpenAI’s API directly with HttpClient. For simple chat completion, that’s fine. But the moment you need function calling with automatic tool invocation, conversation memory, or multi-step workflows, you’re rebuilding what SK already provides. SK handles the orchestration loop — tool schema generation, response parsing, function dispatch, result injection — so you focus on your business logic.
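To make that concrete, here is roughly the loop you end up hand-rolling without SK. This is a sketch only; httpChat, Dispatch, and ToolResultMessage are hypothetical stand-ins, not a real client API:

```csharp
// The manual tool-calling loop SK's automatic function calling replaces.
// All names below are illustrative placeholders.
while (true)
{
    // 1. Send the conversation plus hand-written JSON tool schemas.
    var response = await httpChat.SendAsync(messages, toolSchemas);

    // 2. If the model answered in plain text, we're done.
    if (response.ToolCalls.Count == 0)
        return response.Text;

    // 3. Otherwise parse each requested tool call, dispatch it to your
    //    C# method, and inject the result back into the conversation.
    foreach (var call in response.ToolCalls)
    {
        var result = Dispatch(call.Name, call.Arguments);
        messages.Add(ToolResultMessage(call.Id, result));
    }
    // 4. Loop so the model can use the results (or call more tools).
}
```

SK performs every step of this loop for you once FunctionChoiceBehavior.Auto() is enabled, including generating the tool schemas from your [KernelFunction] attributes.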

SK vs Microsoft.Extensions.AI

They’re complementary, not competing. Microsoft.Extensions.AI defines the provider-agnostic interfaces (IChatClient, IEmbeddingGenerator). SK builds on top of those interfaces to add plugins, memory, and planning. Use Microsoft.Extensions.AI if you only need raw chat completion with DI support. Use SK when you need orchestration.

Getting Started

Install the SDK:

dotnet add package Microsoft.SemanticKernel

For Azure OpenAI connectivity:

dotnet add package Microsoft.SemanticKernel.Connectors.AzureOpenAI

Build your first kernel and make a chat completion call:

using Microsoft.SemanticKernel;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "chat-deployment",
        endpoint: Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!,
        apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!)
    .Build();

var response = await kernel.InvokePromptAsync("Explain dependency injection in 2 sentences.");
Console.WriteLine(response);

That’s it. From here, you add plugins for your domain, connect memory for persistent context, and build the AI features your application needs.

Next Steps

⚠ Production Considerations

  • Don't create multiple Kernel instances per request — use DI to register a singleton or scoped kernel builder and build per-request kernels from it.
  • Prompt template rendering is synchronous. For templates with many variable substitutions, consider pre-computing values to avoid blocking.
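One way to wire that DI setup in ASP.NET Core is SK's AddKernel extension, which registers a transient Kernel that shares the app's service container. A minimal sketch (the configuration keys, IOrderService, and OrderService are illustrative):

```csharp
using Microsoft.SemanticKernel;

var builder = WebApplication.CreateBuilder(args);

// Plugins with constructor dependencies resolve from the same container.
builder.Services.AddSingleton<IOrderService, OrderService>();
builder.Services.AddKernel()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "chat-deployment",
        endpoint: builder.Configuration["AzureOpenAI:Endpoint"]!,  // illustrative keys
        apiKey: builder.Configuration["AzureOpenAI:ApiKey"]!);

var app = builder.Build();

// Kernel can now be injected into controllers, minimal APIs, or services.
app.MapGet("/ask", async (Kernel kernel, string q) =>
    (await kernel.InvokePromptAsync(q)).ToString());

app.Run();
```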


🧠 Architect’s Note

Semantic Kernel is the stable orchestration layer in the Microsoft AI stack. Even as Agent Framework gains adoption, SK remains the right tool for non-agentic scenarios. Design your plugins as SK KernelFunctions — they'll work in SK, Agent Framework, and future Microsoft AI tooling.

AI-Friendly Summary


Semantic Kernel is Microsoft's open-source AI orchestration SDK for .NET. It provides plugins (reusable AI tools), memory (vector store integration), and planning (multi-step AI workflows) for C# applications. In the 2026 Microsoft AI stack, SK sits above Microsoft.Extensions.AI and below Microsoft Agent Framework. SK is the right choice for most .NET AI integration scenarios; Agent Framework extends it for autonomous multi-agent systems.

Key Takeaways

  • Semantic Kernel is Microsoft's AI orchestration SDK for .NET — open source, MIT licensed
  • Core concepts: Kernel (central hub), Plugins (tools), Memory (vector stores), Functions (callable units)
  • SK sits between Microsoft.Extensions.AI (abstraction) and Agent Framework (agentic layer)
  • Works with Azure OpenAI, OpenAI, Ollama, and any IChatClient-compatible provider
  • Plugins written for SK transfer directly to Microsoft Agent Framework

Implementation Checklist

  • Install Microsoft.SemanticKernel NuGet package
  • Create a Kernel instance with AI model configuration
  • Add plugins using KernelFunction attribute
  • Enable automatic function calling for tool use
  • Connect memory for persistent context (optional)

Frequently Asked Questions

What is Semantic Kernel in simple terms?

Semantic Kernel is Microsoft's open-source SDK that lets .NET developers integrate AI capabilities — chat, reasoning, tool use, memory — into their applications using C#. It handles the orchestration between your code and modern AI models, providing plugins, memory, and planning capabilities.

Is Semantic Kernel the same as LangChain?

They solve similar problems but for different ecosystems. LangChain targets Python developers. Semantic Kernel targets .NET/C# developers. Architecturally, SK uses a plugin model with native DI support, while LangChain uses chains and agents. For .NET teams, SK is the better fit due to its integration with Azure OpenAI, Microsoft.Extensions.AI, and the broader .NET ecosystem.

Should I learn Semantic Kernel or Microsoft Agent Framework?

Learn both, but start with Semantic Kernel. Agent Framework is built on top of SK — understanding SK plugins, kernel configuration, and function calling makes learning Agent Framework straightforward. If you're building non-agentic AI features (chat, RAG, function calling), SK is all you need. If you need autonomous multi-agent systems, learn SK first, then add Agent Framework.

Does Semantic Kernel require Azure?

No. Semantic Kernel works with Azure OpenAI, OpenAI directly, Ollama for local models, and any provider that implements the IChatClient interface from Microsoft.Extensions.AI. Azure OpenAI is the most common production choice for enterprise teams, but it's not required.

Is Semantic Kernel free?

Yes. Semantic Kernel is open-source under the MIT license. The SDK itself is free. You pay for the AI model provider you connect it to (Azure OpenAI, OpenAI, etc.). For local models via Ollama, even the model inference is free.



#Semantic Kernel #.NET AI #Azure OpenAI #AI Orchestration #Getting Started