LangChain MCP – Restaurant Example

A demonstration of using the Model Context Protocol (MCP) to expose restaurant-related tools and integrate them with a LangChain agent.


📁 Repo Structure

  • restaurant_mcp.py – MCP server exposing restaurant-related tools (see the sketch after this list).
  • restaurant_data.py – Mock dataset (restaurant listings and menus).
  • mcp_client.py – Python client to connect and call MCP server tools.
  • requirements.txt – Python dependencies for server & client.
  • .env – Environment variables file; holds your OPENAI_API_KEY.
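
For orientation, here is a minimal sketch of how restaurant_mcp.py could be structured with FastMCP. The tool name, the shape of the data imported from restaurant_data.py, and the SSE transport are assumptions made for illustration; the repository's actual code may differ.

    from mcp.server.fastmcp import FastMCP

    # Assumed shape of the mock dataset; the real restaurant_data.py may differ.
    from restaurant_data import MENU_ITEMS

    mcp = FastMCP("restaurant")

    @mcp.tool()
    def get_menu(category: str = "all") -> list[dict]:
        """Return menu items, optionally filtered by category (e.g. 'veg', 'soups')."""
        if category == "all":
            return MENU_ITEMS
        return [item for item in MENU_ITEMS if category.lower() in item["category"].lower()]

    if __name__ == "__main__":
        # The README starts this file standalone in its own terminal, so a network
        # transport (SSE) is assumed here rather than stdio.
        mcp.run(transport="sse")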

🚀 Setup

  1. Clone and install dependencies

    git clone https://github.com/pk-pkulkarni/langchain_mcp.git
    cd langchain_mcp
    pip install -r requirements.txt
    
  2. Run the MCP server
    Open a terminal and start the server:

    python restaurant_mcp.py

⚠️ Make sure restaurant_mcp.py is running as a server in your terminal before proceeding.

  3. Run the client (for testing; see the client sketch after these steps)
    In a separate terminal, run:

    python mcp_client.py

  4. Run the FastAPI server (see the API sketch after these steps)
    Run the file with uvicorn:

    uvicorn mcp_client_api_chatbot:app --reload --port 8000

  5. Open chatbot.html
    Try example queries such as:

    Give me veg menu
    Give me all soups
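
For reference, here is a minimal sketch of what a client such as mcp_client.py might look like, assuming the langchain-mcp-adapters and langgraph packages, an SSE connection to the running MCP server, and an OpenAI chat model. The URL, port, and model name are placeholders, not values taken from the repository.

    import asyncio

    from dotenv import load_dotenv
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_openai import ChatOpenAI
    from langgraph.prebuilt import create_react_agent

    load_dotenv()  # reads OPENAI_API_KEY from .env

    async def main():
        # Assumed SSE endpoint of the running restaurant_mcp.py server.
        client = MultiServerMCPClient(
            {"restaurant": {"url": "http://localhost:8001/sse", "transport": "sse"}}
        )
        tools = await client.get_tools()  # MCP tools wrapped as LangChain tools
        agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
        result = await agent.ainvoke(
            {"messages": [{"role": "user", "content": "Give me veg menu"}]}
        )
        print(result["messages"][-1].content)

    asyncio.run(main())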

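The FastAPI app in mcp_client_api_chatbot.py could be organised roughly as below; the /chat route, request shape, and CORS settings are assumptions made so chatbot.html has something concrete to call, and the placeholder ask_agent stands in for the LangChain agent call from the client sketch above.

    from fastapi import FastAPI
    from fastapi.middleware.cors import CORSMiddleware
    from pydantic import BaseModel

    app = FastAPI()

    # chatbot.html is opened straight from disk, so permissive CORS is assumed.
    app.add_middleware(
        CORSMiddleware, allow_origins=["*"], allow_methods=["*"], allow_headers=["*"]
    )

    class ChatRequest(BaseModel):
        message: str

    async def ask_agent(message: str) -> str:
        # Placeholder for the agent call; the real app would forward the message
        # to the LangChain agent built on the MCP restaurant tools.
        return f"(agent reply to: {message})"

    @app.post("/chat")
    async def chat(request: ChatRequest) -> dict:
        return {"reply": await ask_agent(request.message)}

With the server running on port 8000, an endpoint like this could be exercised with:

    curl -X POST http://localhost:8000/chat -H "Content-Type: application/json" -d '{"message": "Give me veg menu"}'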