TL;DR
Microsoft Agent Framework (MAF) merges Semantic Kernel and AutoGen into a single, production-ready agent runtime built on Microsoft.Extensions.AI. This post walks through five progressive samples — creating your first agent, adding tools, composing agents, multi-turn conversations, and persistent memory — all running as single-file C# scripts with dotnet run.
- TL;DR
- Introduction — What is MAF?
- Your first agent
- Tools — Giving agents the ability to act
- Multi-turn conversations
- Memory and persistence
- Key takeaways
- What’s next
- Presentation
- References
Source code: https://github.com/NikiforovAll/maf-getting-started
Introduction — What is MAF?
.NET had two AI agent frameworks from Microsoft: Semantic Kernel for enterprise orchestration and AutoGen for multi-agent research. Two ecosystems, overlapping goals, confusion about which to pick. MAF unifies them into one framework.
| Before | After |
|---|---|
| Semantic Kernel — enterprise AI orchestration | Built on Microsoft.Extensions.AI abstractions |
| AutoGen — multi-agent research framework | Microsoft.Agents.AI — unified agent runtime |
| Two ecosystems, overlapping goals | Single API for single & multi-agent scenarios |
1️⃣ The architecture is layered:
| Layer | Components |
|---|---|
| Your Application | AIAgent, Tools, Sessions, Workflows |
| Microsoft.Agents.AI | Unified agent runtime |
| Microsoft.Extensions.AI | IChatClient, AIFunction |
| Providers | Azure OpenAI, OpenAI, Ollama, ... |
2️⃣ The core concepts:
| Concept | Type | Purpose |
|---|---|---|
| AIAgent | AIAgent (abstract class) | Core agent abstraction |
| Tools | AIFunction | Functions the agent can call |
| Session | AgentSession | Conversation state & history |
| Run | RunAsync / RunStreamingAsync | Execute agent with input |
| Workflow | WorkflowBuilder | Multi-agent orchestration |
Your first agent
All samples use .NET 10’s run-file feature — single .cs files with #:package directives, no .csproj needed:
export AZURE_OPENAI_ENDPOINT="https://your-resource.cognitiveservices.azure.com/"
export AZURE_OPENAI_DEPLOYMENT_NAME="gpt-4o-mini"
dotnet run src/01-hello-agent.cs
Here’s the full agent — 38 lines from zero to running:
#:package Microsoft.Agents.AI.OpenAI@1.0.0-rc2
#:package Azure.AI.OpenAI@2.8.0-beta.1
#:package Azure.Identity@1.18.0
using Azure.AI.OpenAI;
using Azure.Identity;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OpenAI.Chat;
var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
var deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME");
AIAgent agent = new AzureOpenAIClient(new Uri(endpoint!), new DefaultAzureCredential())
.GetChatClient(deploymentName)
.AsAIAgent(
new ChatClientAgentOptions
{
Name = "HelloAgent",
ChatOptions = new ChatOptions
{
Instructions = "You are a friendly assistant. Keep your answers brief.",
Temperature = 0.9f,
},
}
);
// Non-streaming
Console.WriteLine(await agent.RunAsync("Tell me a one-sentence fun fact."));
// Streaming — process tokens as they arrive
await foreach (var update in agent.RunStreamingAsync("Tell me a one-sentence fun fact."))
{
    Console.Write(update); // write each chunk inline instead of one line per chunk
}
Console.WriteLine();
The pipeline is straightforward:
| Step | Call | Role |
|---|---|---|
| 1 | new AzureOpenAIClient(...) | Azure OpenAI provider |
| 2 | .GetChatClient("gpt-4o-mini") | ChatClient via IChatClient |
| 3 | .AsAIAgent(options) | AIAgent from MAF |
| 4 | .RunAsync("prompt") | Execute and get response |
AsAIAgent() is the key extension method. It turns any IChatClient into a full agent. Because it’s built on IChatClient, you can swap the provider (Azure OpenAI, OpenAI, Ollama) without touching agent code.
Tools — Giving agents the ability to act
Function tools
Define a plain C# method with [Description] attributes and register it via AIFunctionFactory.Create():
[Description("Get the weather for a given location.")]
static string GetWeather(
[Description("The location to get the weather for.")] string location) =>
$"The weather in {location} is cloudy with a high of 15°C.";
AIAgent weatherAgent = client
.GetChatClient(deploymentName)
.AsAIAgent(
instructions: "You are a helpful weather assistant.",
name: "WeatherAgent",
description: "An agent that answers weather questions.",
tools: [AIFunctionFactory.Create(GetWeather)]
);
Console.WriteLine(await weatherAgent.RunAsync("What is the weather like in Amsterdam?"));
How tool calling works:
- User sends: “What’s the weather in Amsterdam?”
- LLM decides to call GetWeather("Amsterdam")
- MAF invokes the C# method automatically
- Result is fed back to the LLM
- LLM generates the final answer using the tool result
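The loop is easy to model without any LLM in sight. Here is a minimal simulation (hypothetical names, not MAF APIs) in which the model's "decision" is hard-coded and the runtime dispatches it from a registry of functions:

```csharp
using System;
using System.Collections.Generic;

// Registry of callable tools, keyed by name — stands in for the tool list
// the agent sends to the model.
var tools = new Dictionary<string, Func<string, string>>
{
    ["GetWeather"] = location => $"The weather in {location} is cloudy with a high of 15°C."
};

// 1. The model emits a tool call, e.g. GetWeather("Amsterdam")
var (toolName, toolArg) = ("GetWeather", "Amsterdam");

// 2. The runtime invokes the registered C# function automatically
var toolResult = tools[toolName](toolArg);

// 3. The tool result is fed back to the model, which writes the final answer
var finalAnswer = $"According to the tool: {toolResult}";
Console.WriteLine(finalAnswer);
```

MAF does the same dance for you: it advertises the tools, parses the model's tool-call response, invokes the matching method, and loops until the model produces a plain text answer.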
The [Description] attributes are sent to the LLM as the tool schema — write clear, specific descriptions because that’s all the model sees when deciding which tool to call.
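For illustration, this is roughly the shape of the schema derived from those attributes. It is a sketch of the common JSON-schema style used by chat providers, not the exact wire format MAF emits:

```csharp
using System;
using System.Text.Json;

// Illustrative tool schema built from GetWeather's [Description] attributes —
// the method description, one string parameter, and its description.
var toolSchema = new
{
    name = "GetWeather",
    description = "Get the weather for a given location.",
    parameters = new
    {
        type = "object",
        properties = new
        {
            location = new
            {
                type = "string",
                description = "The location to get the weather for."
            }
        },
        required = new[] { "location" }
    }
};

var json = JsonSerializer.Serialize(toolSchema, new JsonSerializerOptions { WriteIndented = true });
Console.WriteLine(json);
```

Notice there is no implementation in the schema at all — only names and descriptions — which is why vague descriptions lead to wrong tool choices.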
Agent-as-tool
Any agent can become a tool for another agent via .AsAIFunction():
AIAgent orchestrator = client
.GetChatClient(deploymentName)
.AsAIAgent(
instructions: "You are a helpful assistant. Use the weather agent when asked about weather.",
tools: [weatherAgent.AsAIFunction()]
);
Console.WriteLine(await orchestrator.RunAsync("What's the weather in Amsterdam and Paris?"));
The orchestrator delegates to WeatherAgent when it encounters weather questions. The weather agent, in turn, calls its own GetWeather tool for each location.
Each agent has its own LLM call — be mindful of latency and cost when nesting agents.
Multi-turn conversations
Without a session, each RunAsync call is stateless — the agent has no memory of prior turns. AgentSession holds the conversation thread:
AIAgent agent = new AzureOpenAIClient(new Uri(endpoint!), new DefaultAzureCredential())
.GetChatClient(deploymentName)
.AsAIAgent(
instructions: """You are a friendly assistant. Keep your answers brief.
And always remember the information the user shares with you
during the conversation.
""",
name: "ConversationAgent"
);
AgentSession session = await agent.CreateSessionAsync();
// Turn 1 — introduce context
Console.WriteLine(await agent.RunAsync("My name is Alice and I love hiking.", session));
// Turn 2 — agent remembers from session history
Console.WriteLine(await agent.RunAsync("What do you remember about me?", session));
// Turn 3 — agent uses accumulated context
Console.WriteLine(await agent.RunAsync("Suggest a hiking destination for me.", session));
Under the hood, the session accumulates the full conversation and sends it with each LLM call:
| Turn | Input | ChatHistory contents |
|---|---|---|
| created | — | [] (empty) |
| 1 | "My name is Alice..." | [system, user₁, assistant₁] |
| 2 | "What do you remember?" | [system, user₁, assistant₁, user₂, assistant₂] |
| 3 | "Suggest a destination" | [system, user₁, assistant₁, user₂, assistant₂, user₃, assistant₃] |
The default provider is InMemoryChatHistoryProvider — zero config, but lost on process restart.
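That accumulation is easy to model: an in-memory provider is essentially a list that gets replayed on every call. A toy sketch (not the real InMemoryChatHistoryProvider):

```csharp
using System;
using System.Collections.Generic;

// Toy model of in-memory chat history: each turn appends a user and an
// assistant message, and the full list is replayed on the next LLM call.
var history = new List<(string Role, string Text)>
{
    ("system", "You are a friendly assistant.")
};

void Turn(string user, string assistant)
{
    // In the real provider, everything already in `history` is sent
    // to the LLM before these two messages are appended.
    history.Add(("user", user));
    history.Add(("assistant", assistant));
}

Turn("My name is Alice and I love hiking.", "Nice to meet you, Alice!");
Turn("What do you remember about me?", "Your name is Alice and you love hiking.");

Console.WriteLine(history.Count); // system + 2 user + 2 assistant = 5
```

This also explains the cost profile: token usage grows with every turn, because the whole list is resent each time.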
Memory and persistence
In-memory and serialization
The default InMemoryChatHistoryProvider works for single-process scenarios. For persistence across restarts, MAF provides serialization:
AgentSession session = await agent.CreateSessionAsync();
await agent.RunAsync("My name is Alice", session);
// Serialize session to JSON
var serialized = await agent.SerializeSessionAsync(session);
Console.WriteLine($"Session serialized ({serialized.GetRawText().Length} chars)");
// Store anywhere — database, file, Redis, blob storage
// Restore session from serialized data
var restoredSession = await agent.DeserializeSessionAsync(serialized);
// Agent remembers everything from the original session
await agent.RunAsync("Do you still remember my name?", restoredSession);
// → "Yes, your name is Alice!"
SerializeSessionAsync / DeserializeSessionAsync give you portable session state. Export to JSON, store it however you want, restore later.
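A minimal round trip for the "store anywhere" step, using a temp file and plain System.Text.Json on a stand-in payload (the real SerializeSessionAsync returns a JsonElement you would persist the same way):

```csharp
using System;
using System.IO;
using System.Text.Json;

// Stand-in for the JsonElement returned by SerializeSessionAsync:
// any JSON payload can be written to disk and parsed back later.
using var doc = JsonDocument.Parse("""{"messages":[{"role":"user","text":"My name is Alice"}]}""");
JsonElement serialized = doc.RootElement;

// Store: write the raw JSON to a file (could be a DB row, a Redis key, a blob...).
var path = Path.Combine(Path.GetTempPath(), "session.json");
File.WriteAllText(path, serialized.GetRawText());

// Restore: read it back into a JsonElement — this is what you would
// hand to DeserializeSessionAsync to rebuild the session.
using var restoredDoc = JsonDocument.Parse(File.ReadAllText(path));
JsonElement restored = restoredDoc.RootElement;
var restoredText = restored.GetProperty("messages")[0].GetProperty("text").GetString();
Console.WriteLine(restoredText); // → "My name is Alice"
```

The storage layer never needs to understand the payload — it is opaque JSON from its point of view.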
Custom ChatHistoryProvider
For more control, implement ChatHistoryProvider directly. Here’s a simple file-backed provider:
// Requires: using System.Text.Json; and a source-generated ChatHistoryJsonContext
// (a JsonSerializerContext exposing List<ChatMessage> type info as ListChatMessage).
sealed class FileChatHistoryProvider(string filePath) : ChatHistoryProvider
{
protected override ValueTask<IEnumerable<ChatMessage>> ProvideChatHistoryAsync(
InvokingContext context, CancellationToken cancellationToken = default)
{
if (!File.Exists(filePath))
return new(Enumerable.Empty<ChatMessage>());
var json = File.ReadAllText(filePath);
var messages = JsonSerializer.Deserialize(json, ChatHistoryJsonContext.Default.ListChatMessage) ?? [];
return new(messages.AsEnumerable());
}
protected override ValueTask StoreChatHistoryAsync(InvokedContext context, CancellationToken cancellationToken = default)
{
List<ChatMessage> existing = [];
if (File.Exists(filePath))
{
var json = File.ReadAllText(filePath);
existing = JsonSerializer.Deserialize(json, ChatHistoryJsonContext.Default.ListChatMessage) ?? [];
}
existing.AddRange(context.RequestMessages);
existing.AddRange(context.ResponseMessages ?? []);
File.WriteAllText(path: filePath, JsonSerializer.Serialize(existing, ChatHistoryJsonContext.Default.ListChatMessage));
return default;
}
}
Wire it up via ChatClientAgentOptions:
AIAgent agent = new AzureOpenAIClient(new Uri(endpoint!), new DefaultAzureCredential())
.GetChatClient(deploymentName)
.AsAIAgent(new ChatClientAgentOptions
{
Name = "PersistentAgent",
ChatOptions = new ChatOptions
{
Instructions = "You are a friendly assistant. Keep your answers brief.",
},
ChatHistoryProvider = new FileChatHistoryProvider(filePath),
});
This survives process restarts — create a new agent with the same file, and it picks up where it left off. The limitation: all sessions share the same file. Concurrent sessions will clobber each other’s history.
Session-aware ChatHistoryProvider
For per-session isolation, use ProviderSessionState<TState>. Each session gets its own file identified by a unique session ID:
sealed class FileChatHistoryProvider : ChatHistoryProvider
{
private readonly string _directory;
private readonly ProviderSessionState<SessionState> _sessionState;
public FileChatHistoryProvider(string directory, string? existingSessionId = null)
{
_directory = directory;
_sessionState = new ProviderSessionState<SessionState>(
_ => new SessionState(existingSessionId ?? Guid.NewGuid().ToString("N")[..8]),
nameof(FileChatHistoryProvider));
}
public string GetSessionId(AgentSession? session) =>
_sessionState.GetOrInitializeState(session).SessionId;
protected override ValueTask<IEnumerable<ChatMessage>> ProvideChatHistoryAsync(
InvokingContext context, CancellationToken cancellationToken = default)
{
var state = _sessionState.GetOrInitializeState(context.Session);
var path = Path.Combine(_directory, $"{state.SessionId}.json");
// ... read from session-specific file
}
// ... StoreChatHistoryAsync implementation similar to before, but using session-specific file
sealed class SessionState(string sessionId)
{
public string SessionId { get; } = sessionId;
}
}
To restore a session after restart, extract the session ID and pass it to a new provider:
var sessionId = historyProvider.GetSessionId(session);
var restoredProvider = new FileChatHistoryProvider(historyDir, sessionId);
AIAgent agent2 = client.GetChatClient(deploymentName)
.AsAIAgent(new ChatClientAgentOptions
{
ChatHistoryProvider = restoredProvider,
});
AgentSession session2 = await agent2.CreateSessionAsync();
await agent2.RunAsync("Do you remember my name?", session2); // → "Alice"
Key takeaways
- AsAIAgent() — one extension method turns any IChatClient into a full agent
- [Description] + AIFunctionFactory.Create() — plain C# methods become LLM-callable tools
- .AsAIFunction() — any agent can become a tool for another agent
- AgentSession — pass to RunAsync to maintain multi-turn conversation history
- ChatHistoryProvider — the extension point for production persistence
What’s next
Part 2 covers Workflows, MCP, and AG-UI — orchestrating multi-agent pipelines, exposing agents as MCP servers, and streaming to frontends via the AG-UI protocol.
Presentation
References