A modern MCP (Model Context Protocol) server built with Python, FastMCP, FastAPI, and Docker, and integrated with Ollama and Open-webUI.
This project provides a lightweight, extensible foundation for building and deploying intelligent systems that manage and expose AI/LLM capabilities.
- 🐍 Python – Primary language
- ⚡ FastMCP – Framework for building Model Context Protocol (MCP) servers and tools
- 🌐 FastAPI – High-performance API backend
- 🧠 Ollama – Local LLM runtime for pulling, running, and serving models
- 🧩 Open-webUI – Chat-style interface for AI interactions
- 🐳 Docker – Containerized for easy deployment and reproducibility in dev and production environments.
- 🚀 Fast startup with Docker
- 🔌 Easy integration with Ollama and Open-webUI
- 📦 Pluggable architecture for adding models and routes (see the sketch after this list)
- 🎯 Designed for rapid prototyping or production use
- ✅ REST API ready with OpenAPI docs
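To illustrate how the pluggable architecture might look in code, here is a minimal sketch of a FastMCP server mounted inside a FastAPI app. It assumes FastMCP 2.x's `http_app()` ASGI integration; the server name `demo-mcp`, the `add_numbers` tool, and the `/health` route are illustrative and not taken from this repository — only the `/service/mcp/` path mirrors the endpoint listed further below.

```python
from fastapi import FastAPI
from fastmcp import FastMCP

mcp = FastMCP("demo-mcp")

@mcp.tool()
def add_numbers(a: int, b: int) -> int:
    """Illustrative tool; not part of this repository."""
    return a + b

# Build an ASGI sub-application for the MCP server.
mcp_app = mcp.http_app(path="/mcp")

# Pass the MCP app's lifespan so its session manager starts with FastAPI,
# and mount under /service so the endpoint becomes /service/mcp/.
app = FastAPI(lifespan=mcp_app.lifespan)
app.mount("/service", mcp_app)

@app.get("/health")
def health() -> dict:
    """Plain FastAPI route living next to the mounted MCP server."""
    return {"status": "ok"}
```

If this were saved as `main.py`, running `uvicorn main:app --port 8000` would expose the MCP endpoint at http://localhost:8000/service/mcp/ alongside the usual OpenAPI docs.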
git clone https://github.com/rainer85ah/mcp-server.git
cd mcp-server
docker compose up --build -d

Ollama: http://localhost:11434
API Docs: http://localhost:8000/docs
OpenAPI: http://localhost:8000/openapi.json
MCP Server: http://localhost:8000/service/mcp/ (a client sketch follows the lists below)
Open-webUI: http://localhost:3000

Use this project as a starter template for:
- AI chat platforms
- Model routing gateways
- Developer LLM sandboxes
- FastAPI-based ML backends
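As a quick smoke test of the MCP endpoint listed above, and a starting point for the sandbox and gateway use cases, here is a hedged client sketch. It assumes the FastMCP 2.x `Client` API and reuses the illustrative `add_numbers` tool from the earlier sketch; adjust the URL and tool name to whatever the running server actually exposes.

```python
import asyncio
from fastmcp import Client

async def main() -> None:
    # URL taken from the MCP Server endpoint listed above.
    async with Client("http://localhost:8000/service/mcp/") as client:
        tools = await client.list_tools()
        print("available tools:", [tool.name for tool in tools])

        # "add_numbers" is the illustrative tool from the earlier sketch;
        # replace it with a tool the running server actually exposes.
        result = await client.call_tool("add_numbers", {"a": 2, "b": 3})
        print("add_numbers(2, 3) ->", result)

if __name__ == "__main__":
    asyncio.run(main())
```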
This project is licensed under the MIT License.
You are free to use, modify, and distribute this software with proper attribution.
Contributions are welcome! Feel free to:
- ⭐ Star the project
- 🍴 Fork the repo
- 🛠️ Open issues or feature requests
- 🔁 Submit pull requests
Created with 💡 by Rainer Arencibia
🔗 Connect with me on LinkedIn
