Note
This sample assumes you're using a GitHub Codespaces instance. If you want to run this locally, you need to set up a personal access token (PAT) on GitHub.
```bash
# zsh/bash
export GITHUB_TOKEN="{{YOUR_GITHUB_PAT}}"
```

```powershell
# PowerShell
$env:GITHUB_TOKEN = "{{YOUR_GITHUB_PAT}}"
```

Next, restore the project's dependencies:

```bash
dotnet restore
```

This should install the following libraries: Azure AI Inference, Azure Identity, Microsoft.Extensions.Hosting, and ModelContextProtocol.
Then run the app:

```bash
dotnet run
```

You should see output similar to:

```text
Setting up stdio transport
Listing tools
Connected to server with tools: Add
Tool description: Adds two numbers
Tool parameters: {"title":"Add","description":"Adds two numbers","type":"object","properties":{"a":{"type":"integer"},"b":{"type":"integer"}},"required":["a","b"]}
Tool definition: Azure.AI.Inference.ChatCompletionsToolDefinition
Properties: {"a":{"type":"integer"},"b":{"type":"integer"}}
MCP Tools def: 0: Azure.AI.Inference.ChatCompletionsToolDefinition
Tool call 0: Add with arguments {"a":2,"b":4}
Sum 6
```
A lot of this output is just debugging information, but what's important is the flow: the client lists the tools from the MCP server, turns them into LLM tool definitions, and the LLM's resulting tool call produces the MCP client response "Sum 6".
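That flow — connect over stdio, list the server's tools, and invoke the tool the LLM asks for — can be sketched with the ModelContextProtocol client API. This is a minimal sketch, not the sample's exact code; the server name, project path, and the direct tool invocation at the end are assumptions for illustration.

```csharp
using ModelContextProtocol.Client;

// Launch the MCP server as a child process and talk to it over stdio.
// (Command and arguments are assumptions -- point them at your server project.)
var transport = new StdioClientTransport(new StdioClientTransportOptions
{
    Name = "Calculator",
    Command = "dotnet",
    Arguments = ["run", "--project", "path/to/server"],
});

var client = await McpClientFactory.CreateAsync(transport);

// The "Listing tools" step: discover what the server exposes (here, Add).
foreach (var tool in await client.ListToolsAsync())
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}

// In the sample the LLM decides to call Add with {"a":2,"b":4};
// here we invoke the tool directly just to show the call shape.
var result = await client.CallToolAsync(
    "Add",
    new Dictionary<string, object?> { ["a"] = 2, ["b"] = 4 });
```

In the actual sample, the discovered tools are additionally converted into Azure.AI.Inference `ChatCompletionsToolDefinition` objects so the model can request the `Add` call itself; the direct `CallToolAsync` above only shows the underlying MCP invocation.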