
Commit 1898937

Merge pull request #456 from MervinPraison/develop

Increment version to 0.0.77 in `pyproject.toml` and update `markitdow…

2 parents: 8fb8c21 + dacaa7f

File tree

15 files changed: +1080 −46 lines

docs/mcp/sse.mdx

Lines changed: 248 additions & 0 deletions (new file)
---
title: "MCP SSE Integration"
sidebarTitle: "MCP SSE"
description: "Guide for integrating Server-Sent Events (SSE) with PraisonAI agents using MCP"
icon: "server"
---

## Add SSE Tool to AI Agent

```mermaid
flowchart LR
    In[In] --> Agent[AI Agent]
    Agent --> Tool[SSE MCP]
    Tool --> Agent
    Agent --> Out[Out]

    style In fill:#8B0000,color:#fff
    style Agent fill:#2E8B57,color:#fff
    style Tool fill:#4169E1,color:#fff
    style Out fill:#8B0000,color:#fff
```

## Quick Start

<Steps>
<Step title="Create a client file">

```python
from praisonaiagents import Agent, MCP

search_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="gpt-4o-mini",
    tools=MCP("http://localhost:8080/sse")
)

search_agent.start("What is the weather in London?")
```

</Step>
<Step title="Set Up SSE MCP Server">

```python
# Run with: python mcp-sse-direct-server.py --host 127.0.0.1 --port 8080
from mcp.server.fastmcp import FastMCP
from mcp.server.sse import SseServerTransport
from mcp.server import Server
from starlette.applications import Starlette
from starlette.requests import Request
from starlette.routing import Mount, Route
import uvicorn
import argparse
import logging
import os

# Set up logging based on the LOGLEVEL environment variable
log_level = os.environ.get("LOGLEVEL", "info").upper()
logging.basicConfig(level=getattr(logging, log_level))
logger = logging.getLogger("mcp-server")

# Initialize FastMCP server for simple tools (SSE)
mcp = FastMCP("simple-tools")

@mcp.tool()
async def get_greeting(name: str) -> str:
    """Get a personalized greeting.

    Args:
        name: Name of the person to greet
    """
    logger.debug(f"get_greeting called with name: {name}")
    return f"Hello, {name}! Welcome to our MCP SSE server."

@mcp.tool()
async def get_weather(city: str) -> str:
    """Get a simulated weather report for a city.

    Args:
        city: Name of the city
    """
    logger.debug(f"get_weather called with city: {city}")
    # This is a mock implementation
    weather_data = {
        "Paris": "Sunny with a temperature of 22°C",
        "London": "Rainy with a temperature of 15°C",
        "New York": "Cloudy with a temperature of 18°C",
        "Tokyo": "Clear skies with a temperature of 25°C",
        "Sydney": "Partly cloudy with a temperature of 20°C"
    }
    return weather_data.get(city, f"Weather data not available for {city}")

def create_starlette_app(mcp_server: Server, *, debug: bool = False) -> Starlette:
    """Create a Starlette application that serves the provided MCP server over SSE."""
    sse = SseServerTransport("/messages/")

    async def handle_sse(request: Request) -> None:
        logger.debug(f"SSE connection request received from {request.client}")
        async with sse.connect_sse(
            request.scope,
            request.receive,
            request._send,  # noqa: SLF001
        ) as (read_stream, write_stream):
            await mcp_server.run(
                read_stream,
                write_stream,
                mcp_server.create_initialization_options(),
            )

    return Starlette(
        debug=debug,
        routes=[
            Route("/sse", endpoint=handle_sse),
            Mount("/messages/", app=sse.handle_post_message),
        ],
    )

if __name__ == "__main__":
    mcp_server = mcp._mcp_server  # noqa: WPS437

    parser = argparse.ArgumentParser(description='Run MCP SSE-based server')
    parser.add_argument('--host', default='localhost', help='Host to bind to')
    parser.add_argument('--port', type=int, default=8080, help='Port to listen on')
    args = parser.parse_args()

    print(f"Starting MCP SSE server on {args.host}:{args.port}")

    # Tool names registered above, listed for the startup message
    tool_names = ["get_greeting", "get_weather"]
    print(f"Available tools: {', '.join(tool_names)}")

    # Bind SSE request handling to the MCP server
    starlette_app = create_starlette_app(mcp_server, debug=True)

    uvicorn.run(starlette_app, host=args.host, port=args.port)
```

</Step>

<Step title="Install Dependencies">
Make sure you have the required packages installed:

```bash
pip install "praisonaiagents[llm]" mcp starlette uvicorn httpx
```

</Step>
<Step title="Export API Key">

```bash
export OPENAI_API_KEY="your_api_key"
```

</Step>

<Step title="Run the Server and Agent">
First, start the SSE server:

```bash
python mcp-sse-direct-server.py --host 127.0.0.1 --port 8080
```

Then, in a new terminal, run the agent:

```bash
python weather_agent.py
```

</Step>
</Steps>

<Note>
**Requirements**
- Python 3.10 or higher
- MCP server dependencies
</Note>

## Alternative LLM Integrations

### Using Groq with SSE

```python
from praisonaiagents import Agent, MCP

weather_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="groq/llama-3.2-90b-vision-preview",
    tools=MCP("http://localhost:8080/sse")
)

weather_agent.start("What is the weather in London?")
```

### Using Ollama with SSE

```python
from praisonaiagents import Agent, MCP

weather_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="ollama/llama3.2",
    tools=MCP("http://localhost:8080/sse")
)

weather_agent.start("What is the weather in London? Use get_weather tool, city is the required parameter.")
```

## Gradio UI Integration

Create a Gradio UI for your weather service:

```python
from praisonaiagents import Agent, MCP
import gradio as gr

def get_weather_info(query):
    weather_agent = Agent(
        instructions="""You are a weather agent that can provide weather information for a given city.""",
        llm="gpt-4o-mini",
        tools=MCP("http://localhost:8080/sse")
    )

    result = weather_agent.start(query)
    return f"## Weather Information\n\n{result}"

demo = gr.Interface(
    fn=get_weather_info,
    inputs=gr.Textbox(placeholder="What's the weather in London?"),
    outputs=gr.Markdown(),
    title="Weather MCP Agent",
    description="Ask about the weather in any major city."
)

if __name__ == "__main__":
    demo.launch()
```

## Features

<CardGroup cols={2}>
  <Card title="Real-time Updates" icon="bolt">
    Receive server-sent events in real-time from your AI agent.
  </Card>
  <Card title="Multi-Agent Support" icon="users">
    Combine SSE with other MCP tools for complex workflows.
  </Card>
  <Card title="Multiple LLM Options" icon="brain">
    Use with OpenAI, Groq, Ollama, or other supported LLMs.
  </Card>
  <Card title="Gradio UI" icon="window">
    Create user-friendly interfaces for your SSE integrations.
  </Card>
</CardGroup>

docs/mint.json

Lines changed: 1 addition & 0 deletions

@@ -238,6 +238,7 @@
       "group": "MCP",
       "pages": [
         "mcp/airbnb",
+        "mcp/sse",
         "mcp/ollama",
         "mcp/groq",
         "mcp/openrouter",

src/praisonai-agents/.gitignore

Lines changed: 2 additions & 1 deletion

@@ -84,4 +84,5 @@ build
 *.mp4
 *.png
 graph.py
-chroma_db/
+chroma_db/
+.qodo

src/praisonai-agents/README.md

Lines changed: 83 additions & 0 deletions (new file)

# MCP SSE Server and Client Implementation

This project demonstrates a working pattern for SSE-based MCP (Model Context Protocol) servers and clients. It consists of three main components:

1. **server.py**: An SSE-based MCP server that provides simple tools
2. **client.py**: A standalone client that connects to the server and uses its tools with Claude
3. **mcp-sse.py**: A client using praisonaiagents that connects to the server and uses its tools with OpenAI

## Tools Provided by the Server

The server implements two simple tools:

- **get_greeting**: Returns a personalized greeting for a given name
- **get_weather**: Returns simulated weather data for a given city

## Setup and Usage

### Prerequisites

Make sure you have the required packages installed:

```bash
pip install praisonaiagents mcp httpx starlette uvicorn anthropic python-dotenv
```

### Running the Server

First, start the MCP SSE server:

```bash
python server.py
```

By default, the server runs on 0.0.0.0:8080, but you can customize the host and port:

```bash
python server.py --host 127.0.0.1 --port 8081
```

### Running the Standalone Client

The standalone client uses Claude to interact with the MCP server tools:

```bash
# Set your Anthropic API key
export ANTHROPIC_API_KEY=your_api_key_here

# Run the client
python client.py http://0.0.0.0:8080/sse
```

You'll see a prompt where you can type queries for Claude to process using the MCP tools.

### Running the praisonaiagents Client

The praisonaiagents client uses OpenAI to interact with the MCP server tools:

```bash
# Set your OpenAI API key
export OPENAI_API_KEY=your_api_key_here

# Run the client
python mcp-sse.py
```

This will automatically send a query about the weather in Paris to the agent.

## How It Works

1. The server exposes MCP tools via an SSE endpoint
2. Clients connect to this endpoint and discover the available tools
3. When a user makes a query, the client:
   - For client.py: uses Claude to determine which tool to call
   - For mcp-sse.py: uses OpenAI to determine which tool to call
4. The client executes the tool call against the server
5. The result is returned to the user

This pattern allows for decoupled processes where the MCP server can run independently of clients, making it suitable for cloud-native applications.
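
Steps 3–5 above boil down to routing a model-chosen tool name plus JSON arguments to the matching callable. In the real clients this happens through the MCP session against the server's advertised tools; the following is a dependency-free sketch of just that dispatch step, with a local stand-in registry and illustrative tool bodies:

```python
# Simplified sketch of the tool-dispatch loop described above.
# In the real clients, TOOL_REGISTRY is populated from the tools the
# MCP server advertises; here it is a local stand-in.

TOOL_REGISTRY = {
    "get_weather": lambda city: f"Simulated weather for {city}",
    "get_greeting": lambda name: f"Hello, {name}!",
}

def dispatch(tool_name: str, arguments: dict) -> str:
    """Execute the tool the model selected and return its result."""
    tool = TOOL_REGISTRY.get(tool_name)
    if tool is None:
        return f"Unknown tool: {tool_name}"
    return tool(**arguments)

# The LLM's tool choice arrives as a name plus JSON arguments:
print(dispatch("get_weather", {"city": "Paris"}))  # Simulated weather for Paris
```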

## Customizing

- To add more tools to the server, define new functions with the `@mcp.tool()` decorator in `server.py`
- To change the client's behavior, update the instructions and query in `mcp-sse.py`
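
Conceptually, `@mcp.tool()` registers the decorated function's name, docstring, and signature so the server can advertise it to clients. The sketch below shows that decorator-registration pattern in plain Python; it is not FastMCP's actual internals, and `get_time` is a made-up example tool:

```python
# Dependency-free sketch of the decorator-registration pattern that
# @mcp.tool() uses conceptually (not FastMCP's actual internals).
import inspect

TOOLS = {}

def tool():
    """Register the decorated function as a callable tool."""
    def decorator(func):
        TOOLS[func.__name__] = {
            "func": func,
            "description": inspect.getdoc(func),
            "params": list(inspect.signature(func).parameters),
        }
        return func
    return decorator

@tool()
def get_time(timezone: str) -> str:
    """Return a placeholder time for a timezone."""
    return f"12:00 in {timezone}"

print(sorted(TOOLS))                 # ['get_time']
print(TOOLS["get_time"]["params"])   # ['timezone']
```

The registry is what lets a client discover tools by name and call them with keyword arguments, as in the dispatch flow described in "How It Works".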

0 commit comments

Comments
 (0)