- stdio.py: an example implementation of STDIO with a loop of user input and LLM output
- jsonrpc.py: an example implementation of JSON-RPC 2.0 using HTTP
- stdio: an example implementation of an MCP server using the STDIO transport
- httpstreamable: an example implementation of an MCP server using the Streamable HTTP transport
- stdio: an example implementation of an MCP client using the STDIO transport. Includes wrappers around the read and write streams that log payloads before passing them along.
- httpstreamable: an example implementation of an MCP client using the Streamable HTTP transport. Includes wrappers around the read and write streams that log payloads before passing them along.
- chatbot: an example chatbot application using function calling rather than MCP
- mcp_chatbot: an example chatbot application using MCP
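MCP messages, like the ones jsonrpc.py demonstrates, use the JSON-RPC 2.0 envelope. A minimal sketch of building a request and its matching response (the method name and helper functions here are illustrative, not the repo's actual code):

```python
import json

def make_request(method, params, req_id):
    # JSON-RPC 2.0 request envelope: fixed "jsonrpc" version string,
    # a method name, structured params, and an id to correlate the reply
    return json.dumps({"jsonrpc": "2.0", "method": method, "params": params, "id": req_id})

def make_response(result, req_id):
    # A success response echoes the request id and carries a "result" field
    return json.dumps({"jsonrpc": "2.0", "result": result, "id": req_id})

request = make_request("add", {"a": 1, "b": 2}, 1)
response = make_response(3, 1)
```

Transport (HTTP, STDIO, etc.) only affects how these strings travel; the envelope stays the same.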
Install uv if you haven't already.
See installation methods.
Once uv is installed, set up the environment with:
uv sync
Create a .env file and fill in the variables. See .env.example for reference.
This repo uses Claude as the LLM, so an Anthropic API key is required.
If you don't have one yet, follow these steps:
- Create an Anthropic Account: If you don't have one, go to the Anthropic website and sign up for an account.
- Access the API Keys Section: Log in to the Anthropic Console and look for the "API Keys" or similar section, usually in the left-hand menu or profile settings.
- Generate a New API Key: Click the "Create Key" button. You may need to select a workspace or provide a name for the key to help identify it later.
- Copy and Use the API Key: Anthropic will generate a unique API key. Copy this key immediately, then paste it in your .env file as ANTHROPIC_API_KEY=<your-api-key>
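For illustration, loading that variable from .env amounts to something like the following stdlib-only sketch (the repo may instead use a library such as python-dotenv; the helper name is hypothetical):

```python
import os

def load_env(path=".env"):
    # Minimal .env loader: KEY=value lines, blank lines and '#' comments skipped.
    # setdefault means a variable already set in the environment wins.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```

After calling load_env(), os.environ["ANTHROPIC_API_KEY"] holds the key for the Anthropic client to pick up.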
mcp dev servers/server.py
OR
npx -y @modelcontextprotocol/inspector
cd into the root of this repo.
uv run servers/stdio.py
uv run servers/httpstreamable.py
cd into the root of this repo.
uv run clients/stdio.py
uv run clients/httpstreamable.py
Note: for clients/httpstreamable.py, make sure the server is running before you run the client.
cd into the root of this repo.
===========================
uv run chatbot/1-without-mcp/chatbot.py
Sample queries:
- Search for 2 papers on "LLM interpretability"
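Function calling works by handing the model a JSON schema for each tool it may invoke. A hypothetical definition for a paper-search tool in the Anthropic Messages API "tools" format (the tool name and parameters are illustrative, not necessarily the chatbot's actual ones):

```python
# Hypothetical tool definition; "input_schema" is a standard JSON Schema object
# describing the arguments the model is allowed to pass.
search_papers_tool = {
    "name": "search_papers",
    "description": "Search for papers on a topic and return basic metadata.",
    "input_schema": {
        "type": "object",
        "properties": {
            "topic": {"type": "string", "description": "Topic to search for"},
            "max_results": {"type": "integer", "description": "How many papers to return"},
        },
        "required": ["topic"],
    },
}
```

A query like the one above would lead the model to emit a tool-use request with topic="LLM interpretability" and max_results=2, which the chatbot executes and feeds back.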
===========================
uv run chatbot/2-with-mcp-single-server/mcp_chatbot.py
Sample queries:
- Search for 2 papers on "LLM interpretability"
===========================
uv run chatbot/3-with-mcp-multi-servers/mcp_chatbot.py
Sample queries:
- Fetch the content of this website: https://modelcontextprotocol.io/docs/concepts/architecture and save the content in the file "mcp_summary.md", create a visual diagram that summarizes the content of "mcp_summary.md" and save it in a text file
- Fetch deeplearning.ai and find an interesting term. Search for 2 papers around the term and then summarize your findings and write them to a file called results.txt
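Multi-server setups are typically driven by a JSON config that maps server names to launch commands, in the same shape used by Claude Desktop's mcpServers config. A hypothetical example (server names, commands, and paths are illustrative, not the repo's actual config):

```json
{
  "mcpServers": {
    "research": { "command": "uv", "args": ["run", "research_server.py"] },
    "fetch": { "command": "uvx", "args": ["mcp-server-fetch"] },
    "filesystem": { "command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "."] }
  }
}
```

The client launches each server as a subprocess and aggregates their tools, so one query can span fetch, filesystem, and research tools.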
===========================
uv run chatbot/4-with-mcp-add-reources-prompts/mcp_chatbot.py
Sample queries:
- @folders
- @ai_interpretability
- /prompts
- /prompt generate_search_prompt topic=history num_papers=2
===========================