Merged
3 changes: 2 additions & 1 deletion src/praisonai-agents/.gitignore
@@ -84,4 +84,5 @@ build
*.mp4
*.png
graph.py
chroma_db/
chroma_db/
.qodo
83 changes: 83 additions & 0 deletions src/praisonai-agents/README.md
@@ -0,0 +1,83 @@
# MCP SSE Server and Client Implementation

This project demonstrates a working pattern for SSE-based MCP (Model Context Protocol) servers and clients. It consists of three main components:

1. **server.py**: An SSE-based MCP server that provides simple tools
2. **client.py**: A standalone client that connects to the server and uses its tools with Claude
3. **mcp-sse.py**: A client using praisonaiagents that connects to the server and uses its tools with OpenAI

## Tools Provided by the Server

The server implements two simple tools:

- **get_greeting**: Returns a personalized greeting for a given name
- **get_weather**: Returns simulated weather data for a given city

## Setup and Usage

### Prerequisites

Make sure you have the required packages installed:

```bash
pip install praisonaiagents mcp httpx starlette uvicorn anthropic python-dotenv
```

### Running the Server

First, start the MCP SSE server:

```bash
python server.py
```

By default, the server runs on 0.0.0.0:8080, but you can customize the host and port:

```bash
python server.py --host 127.0.0.1 --port 8081
```

### Running the Standalone Client

The standalone client uses Claude to interact with the MCP server tools:

```bash
# Set your Anthropic API key
export ANTHROPIC_API_KEY=your_api_key_here

# Run the client
python client.py http://0.0.0.0:8080/sse
```

You'll see a prompt where you can type queries for Claude to process using the MCP tools.

### Running the praisonaiagents Client

The praisonaiagents client uses OpenAI to interact with the MCP server tools:

```bash
# Set your OpenAI API key
export OPENAI_API_KEY=your_api_key_here

# Run the client
python mcp-sse.py
```

This will automatically send a query about the weather in Paris to the agent.

## How It Works

1. The server exposes MCP tools via an SSE endpoint
2. Clients connect to this endpoint and discover available tools
3. When a user makes a query, the client:
- For client.py: Uses Claude to determine which tool to call
- For mcp-sse.py: Uses OpenAI to determine which tool to call
4. The client executes the tool call against the server
5. The result is returned to the user

This pattern allows for decoupled processes where the MCP server can run independently of clients, making it suitable for cloud-native applications.

## Customizing

- To add more tools to the server, define new functions with the `@mcp.tool()` decorator in `server.py`
- To change the client's behavior, update the instructions and query in `mcp-sse.py`
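As a sketch of the first customization, a new tool follows the same shape as `get_weather` in `server.py`: an async function with a docstring, registered on the FastMCP instance via `@mcp.tool()`. The `get_time` tool below and its mock data are hypothetical, not part of this PR, and the function body is shown standalone (without the decorator) so it runs even when no MCP server is available:

```python
import asyncio

# Hypothetical example tool, mirroring the get_weather pattern in server.py.
# In server.py this function would additionally carry the @mcp.tool() decorator.
async def get_time(timezone: str) -> str:
    """Get a simulated current time for a timezone.

    Args:
        timezone: Name of the timezone, e.g. "UTC"
    """
    # Mock data, like the simulated weather lookup in get_weather
    times = {
        "UTC": "12:00",
        "CET": "13:00",
    }
    return times.get(timezone, f"Time data not available for {timezone}")

if __name__ == "__main__":
    print(asyncio.run(get_time("UTC")))  # 12:00
```

Once decorated and added to `server.py`, the tool is discovered automatically by both clients on their next `list_tools` call; no client-side registration is needed.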
1 change: 1 addition & 0 deletions src/praisonai-agents/mcp-sse-direct-agent.py
@@ -0,0 +1 @@

120 changes: 120 additions & 0 deletions src/praisonai-agents/mcp-sse-direct-client.py
@@ -0,0 +1,120 @@
# python mcp-sse-direct-client.py http://0.0.0.0:8080/sse
import asyncio
import json
import os
Contributor comment (verification agent): Remove unused `os` import — the `os` module is imported in `src/praisonai-agents/mcp-sse-direct-client.py` but never referenced (confirmed by a grep and an AST scan of the file; Ruff 0.8.2 also flags it as F401, `os` imported but unused). It can be safely removed:

-import os
import sys
from typing import Optional
from contextlib import AsyncExitStack

from mcp import ClientSession
from mcp.client.sse import sse_client

from dotenv import load_dotenv

load_dotenv() # load environment variables from .env

class MCPClient:
    def __init__(self):
        # Initialize session and client objects
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()

    async def connect_to_sse_server(self, server_url: str):
        """Connect to an MCP server running with SSE transport"""
        # Store the context managers so they stay alive
        self._streams_context = sse_client(url=server_url)
        streams = await self._streams_context.__aenter__()

        self._session_context = ClientSession(*streams)
        self.session: ClientSession = await self._session_context.__aenter__()

        # Initialize
        await self.session.initialize()

        # List available tools to verify connection
        print("Initialized SSE client...")
        print("Listing tools...")
        response = await self.session.list_tools()
        tools = response.tools
        print("\nConnected to server with tools:", [tool.name for tool in tools])

        # Print tool descriptions
        for tool in tools:
            print(f"\n{tool.name}: {tool.description}")
            if hasattr(tool, 'inputSchema') and tool.inputSchema:
                print(f"  Parameters: {json.dumps(tool.inputSchema, indent=2)}")

    async def cleanup(self):
        """Properly clean up the session and streams"""
        if hasattr(self, '_session_context'):
            await self._session_context.__aexit__(None, None, None)
        if hasattr(self, '_streams_context'):
            await self._streams_context.__aexit__(None, None, None)

    async def process_query(self, query: str) -> str:
        """Process a query by directly calling the appropriate tool"""
        query = query.strip().lower()

        if query.startswith("hello") or query.startswith("hi"):
            # Extract name or use a default
            parts = query.split()
            name = parts[1] if len(parts) > 1 else "there"

            # Call the greeting tool
            print(f"\nCalling get_greeting with name: {name}")
            result = await self.session.call_tool("get_greeting", {"name": name})
            return result.content[0].text if hasattr(result, 'content') and result.content else str(result)

        elif "weather" in query:
            # Try to extract city name
            city = None
            for known_city in ["Paris", "London", "New York", "Tokyo", "Sydney"]:
                if known_city.lower() in query.lower():
                    city = known_city
                    break

            if not city:
                return "I couldn't identify a city in your query. Please mention a city like Paris, London, New York, Tokyo, or Sydney."

            # Call the weather tool
            print(f"\nCalling get_weather with city: {city}")
            result = await self.session.call_tool("get_weather", {"city": city})
            return result.content[0].text if hasattr(result, 'content') and result.content else str(result)

        else:
            return "I can help with greetings or weather information. Try asking something like 'Hello John' or 'What's the weather in Paris?'"

    async def chat_loop(self):
        """Run an interactive chat loop"""
        print("\nMCP Client Started!")
        print("Type your queries or 'quit' to exit.")

        while True:
            try:
                query = input("\nQuery: ").strip()

                if query.lower() == 'quit':
                    break

                response = await self.process_query(query)
                print("\n" + response)

            except Exception as e:
                print(f"\nError: {str(e)}")


async def main():
    if len(sys.argv) < 2:
        print("Usage: python client.py <URL of SSE MCP server (i.e. http://localhost:8081/sse)>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_sse_server(server_url=sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
95 changes: 95 additions & 0 deletions src/praisonai-agents/mcp-sse-direct-server.py
@@ -0,0 +1,95 @@
# python mcp-sse-direct-server.py --host 127.0.0.1 --port 8080
from typing import Any
import httpx
from mcp.server.fastmcp import FastMCP
from starlette.applications import Starlette
from mcp.server.sse import SseServerTransport
from starlette.requests import Request
from starlette.routing import Mount, Route
from mcp.server import Server
import uvicorn
import argparse
import logging
import os
import inspect

# Set up logging based on environment variable
log_level = os.environ.get("LOGLEVEL", "info").upper()
logging.basicConfig(level=getattr(logging, log_level))
logger = logging.getLogger("mcp-server")

# Initialize FastMCP server for simple tools (SSE)
mcp = FastMCP("simple-tools")

@mcp.tool()
async def get_greeting(name: str) -> str:
    """Get a personalized greeting.

    Args:
        name: Name of the person to greet
    """
    logger.debug(f"get_greeting called with name: {name}")
    return f"Hello, {name}! Welcome to our MCP SSE server."

@mcp.tool()
async def get_weather(city: str) -> str:
    """Get a simulated weather report for a city.

    Args:
        city: Name of the city
    """
    logger.debug(f"get_weather called with city: {city}")
    # This is a mock implementation
    weather_data = {
        "Paris": "Sunny with a temperature of 22°C",
        "London": "Rainy with a temperature of 15°C",
        "New York": "Cloudy with a temperature of 18°C",
        "Tokyo": "Clear skies with a temperature of 25°C",
        "Sydney": "Partly cloudy with a temperature of 20°C"
    }

    return weather_data.get(city, f"Weather data not available for {city}")

def create_starlette_app(mcp_server: Server, *, debug: bool = False) -> Starlette:
    """Create a Starlette application that can serve the provided mcp server with SSE."""
    sse = SseServerTransport("/messages/")

    async def handle_sse(request: Request) -> None:
        logger.debug(f"SSE connection request received from {request.client}")
        async with sse.connect_sse(
            request.scope,
            request.receive,
            request._send,  # noqa: SLF001
        ) as (read_stream, write_stream):
            await mcp_server.run(
                read_stream,
                write_stream,
                mcp_server.create_initialization_options(),
            )

    return Starlette(
        debug=debug,
        routes=[
            Route("/sse", endpoint=handle_sse),
            Mount("/messages/", app=sse.handle_post_message),
        ],
    )

if __name__ == "__main__":
    mcp_server = mcp._mcp_server  # noqa: WPS437

    parser = argparse.ArgumentParser(description='Run MCP SSE-based server')
    parser.add_argument('--host', default='localhost', help='Host to bind to')
    parser.add_argument('--port', type=int, default=8080, help='Port to listen on')
    args = parser.parse_args()

    print(f"Starting MCP SSE server on {args.host}:{args.port}")

    # Hardcode the tool names since we know what they are
    tool_names = ["get_greeting", "get_weather"]
    print(f"Available tools: {', '.join(tool_names)}")

    # Bind SSE request handling to MCP server
    starlette_app = create_starlette_app(mcp_server, debug=True)

    uvicorn.run(starlette_app, host=args.host, port=args.port)
9 changes: 9 additions & 0 deletions src/praisonai-agents/mcp-sse-weather.py
@@ -0,0 +1,9 @@
from praisonaiagents import Agent, MCP

search_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="openai/gpt-4o-mini",
    tools=MCP("http://localhost:8080/sse")
)

search_agent.start("What is the weather in London?")
26 changes: 20 additions & 6 deletions src/praisonai-agents/praisonaiagents/agent/agent.py
@@ -530,11 +530,21 @@ def execute_tool(self, function_name, arguments):
         from ..mcp.mcp import MCP
         if isinstance(self.tools, MCP):
             logging.debug(f"Looking for MCP tool {function_name}")
-            # Check if any of the MCP tools match the function name
-            for mcp_tool in self.tools.runner.tools:
-                if hasattr(mcp_tool, 'name') and mcp_tool.name == function_name:
-                    logging.debug(f"Found matching MCP tool: {function_name}")
-                    return self.tools.runner.call_tool(function_name, arguments)
+            # Handle SSE MCP client
+            if hasattr(self.tools, 'is_sse') and self.tools.is_sse:
+                if hasattr(self.tools, 'sse_client'):
+                    for tool in self.tools.sse_client.tools:
+                        if tool.name == function_name:
+                            logging.debug(f"Found matching SSE MCP tool: {function_name}")
+                            return tool(**arguments)
+            # Handle stdio MCP client
+            elif hasattr(self.tools, 'runner'):
+                # Check if any of the MCP tools match the function name
+                for mcp_tool in self.tools.runner.tools:
+                    if hasattr(mcp_tool, 'name') and mcp_tool.name == function_name:
+                        logging.debug(f"Found matching MCP tool: {function_name}")
+                        return self.tools.runner.call_tool(function_name, arguments)

         # Try to find the function in the agent's tools list first
         func = None
@@ -815,7 +825,11 @@ def chat(self, prompt, temperature=0.2, tools=None, output_json=None, output_pyd
                     logging.debug("Converting MCP tool to OpenAI format")
                     openai_tool = tool_param.to_openai_tool()
                     if openai_tool:
-                        tool_param = [openai_tool]
+                        # Handle both single tool and list of tools
+                        if isinstance(openai_tool, list):
+                            tool_param = openai_tool
+                        else:
+                            tool_param = [openai_tool]
                         logging.debug(f"Converted MCP tool: {tool_param}")

         # Pass everything to LLM class
6 changes: 6 additions & 0 deletions src/praisonai-agents/praisonaiagents/llm/llm.py
@@ -293,6 +293,12 @@ def get_response(
                 if isinstance(tool, dict) and 'type' in tool and tool['type'] == 'function':
                     logging.debug(f"Using pre-formatted OpenAI tool: {tool['function']['name']}")
                     formatted_tools.append(tool)
+                # Handle lists of tools (e.g. from MCP.to_openai_tool())
+                elif isinstance(tool, list):
+                    for subtool in tool:
+                        if isinstance(subtool, dict) and 'type' in subtool and subtool['type'] == 'function':
+                            logging.debug(f"Using pre-formatted OpenAI tool from list: {subtool['function']['name']}")
+                            formatted_tools.append(subtool)
                 elif callable(tool):
                     tool_def = self._generate_tool_definition(tool.__name__)
                     if tool_def:
3 changes: 3 additions & 0 deletions src/praisonai-agents/praisonaiagents/mcp/__init__.py
@@ -1,5 +1,8 @@
"""
Model Context Protocol (MCP) integration for PraisonAI Agents.

This package provides classes and utilities for connecting to MCP servers
using different transport methods (stdio, SSE, etc.).
"""
from .mcp import MCP
