A comprehensive Model Context Protocol (MCP) server that provides tools for interacting with the Comet ML API. This server enables seamless integration with Comet ML's experiment tracking platform through a standardized protocol.
- 🔧 MCP Server: Full Model Context Protocol implementation for tool integration
 - 📊 Experiment Management: List, search, and analyze experiments with detailed metrics
 - 📁 Project Management: Organize and explore projects across workspaces
 - 🔍 Advanced Search: Search experiments by name, description, and project
- 📈 Session Management: Singleton comet_ml.API() instance with robust error handling
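As a rough illustration of the session-management idea, the sketch below shows one way a singleton comet_ml.API() session could be managed with lazy initialization and error handling. The helper name get_api() is hypothetical and is not taken from the comet-mcp source.

```python
# Illustrative sketch only: one way to manage a singleton comet_ml.API() session.
# The get_api() helper is a hypothetical name, not comet-mcp's actual code.
from typing import Optional

import comet_ml

_api: Optional[comet_ml.API] = None


def get_api() -> comet_ml.API:
    """Return a shared comet_ml.API() instance, creating it lazily on first use."""
    global _api
    if _api is None:
        try:
            # Picks up COMET_API_KEY / comet init configuration automatically.
            _api = comet_ml.API()
        except Exception as exc:
            raise RuntimeError(f"Could not initialize Comet ML session: {exc}") from exc
    return _api
```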
Requirements:
- Python 3.8 or higher
- Comet ML account and API key
 
Install with pip:

pip install comet-mcp --upgrade

Alternatively, you can run the Comet MCP server with Docker to avoid installing Python dependencies on your system.
- Build the Docker image:

  docker build -t comet-mcp .

- Configure your MCP client (see the Usage section below for configuration examples)
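If you want a quick smoke test of the image before wiring it into an MCP client, you can print the server's CLI help from inside the container; this assumes the image was tagged comet-mcp as in the build step above and that the container invokes the comet-mcp executable by name, as in the Docker configuration shown later.

```bash
# Quick smoke test: run the CLI help inside the freshly built image.
docker run --rm comet-mcp comet-mcp --help
```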
 
The server uses the standard comet_ml configuration:
- Using comet init; or
- Using environment variables
 
Example:
export COMET_API_KEY=your_comet_api_key_here
# Optional: Set default workspace (if not provided, uses your default)
export COMET_WORKSPACE=your_workspace_name

Available tools:
- list_experiments(workspace, project_name) - List recent experiments with optional filtering
- get_experiment_details(experiment_id) - Get comprehensive experiment information including metrics and parameters
- get_experiment_code(experiment_id) - Retrieve source code from experiments
- get_experiment_metric_data(experiment_ids, metric_names, x_axis) - Get metric data for multiple experiments
- get_default_workspace() - Get the default workspace name for the current user
- list_projects(workspace) - List all projects in a workspace
- list_project_experiments(project_name, workspace) - List experiments within a specific project
- count_project_experiments(project_name, workspace) - Count and analyze experiments in a project
- get_session_info() - Get current session status and connection information
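As a hedged sketch of how these tools can be invoked programmatically, the example below uses the official mcp Python SDK (pip install mcp) to launch the locally installed comet-mcp server over stdio and call list_experiments. The workspace name is a placeholder, and the exact structure of the returned content is determined by the server, not by this example.

```python
# Sketch: calling the comet-mcp server from a Python MCP client over stdio.
# Assumes the `mcp` package is installed and COMET_API_KEY is set in the
# environment; "my-workspace" is a placeholder workspace name.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the locally installed comet-mcp executable as a subprocess.
    server = StdioServerParameters(
        command="comet-mcp",
        args=[],
        env={
            "COMET_API_KEY": os.environ["COMET_API_KEY"],
            "PATH": os.environ["PATH"],  # so the comet-mcp executable can be resolved
        },
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the registered tools, then call one of them.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            result = await session.call_tool(
                "list_experiments", {"workspace": "my-workspace"}
            )
            print(result.content)


asyncio.run(main())
```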
All tools share these characteristics:
- Structured Data: All tools return properly typed data structures
 - Error Handling: Graceful handling of API failures and missing data
 - Flexible Filtering: Filter by workspace, project, or search terms
 - Rich Metadata: Includes timestamps, descriptions, and status information
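Purely for illustration, a single experiment entry returned by list_experiments might look roughly like the dictionary below; the field names are hypothetical placeholders, not the actual comet-mcp schema.

```python
# Hypothetical shape of one experiment entry; field names are illustrative only.
example_entry = {
    "experiment_id": "abc123def456",        # experiment key
    "name": "baseline-run",                 # experiment name
    "project_name": "my-project",           # owning project
    "workspace": "my-workspace",            # owning workspace
    "status": "completed",                  # status information
    "description": "First baseline model",  # optional description
    "created_at": "2024-01-01T12:00:00Z",   # timestamp metadata
}
```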
 
Run the server to provide tools to MCP clients:
# Start the MCP server
comet-mcp

The server will:
- Initialize Comet ML session
 - Register all available tools
 - Listen for MCP client connections via stdio
 
Create a configuration for your AI system. For example:
Local Installation:
{
  "servers": [
    {
      "name": "comet-mcp",
      "description": "Comet ML MCP server for experiment management",
      "command": "comet-mcp",
      "env": {
        "COMET_API_KEY": "${COMET_API_KEY}"
      }
    }
  ]
}

Docker Installation (Alternative):
{
  "mcpServers": {
    "comet-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "COMET_API_KEY",
        "-e",
        "COMET_WORKSPACE",
        "comet-mcp",
        "comet-mcp",
        "--transport",
        "stdio"
      ],
      "env": {
        "COMET_API_KEY": "your_api_key_here",
        "COMET_WORKSPACE": "your_workspace_name"
      }
    }
  }
}

comet-mcp supports "stdio" and "sse" transport modes.
usage: comet-mcp [-h] [--transport {stdio,sse}] [--host HOST] [--port PORT]
Comet ML MCP Server
options:
  -h, --help            show this help message and exit
  --transport {stdio,sse}
                        Transport method to use (default: stdio)
  --host HOST           Host for SSE transport (default: localhost)
  --port PORT           Port for SSE transport (default: 8000)
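For example, to serve over SSE instead of the default stdio transport (the host and port below are placeholder values):

```bash
# Listen for MCP clients over SSE on all interfaces, port 8000.
comet-mcp --transport sse --host 0.0.0.0 --port 8000
```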
For complete details on testing this (or any) MCP server, see examples/README.
This project is licensed under the MIT License - see the LICENSE file for details.
- Documentation: GitHub Repository
 - Issues: GitHub Issues
 - Comet ML: Comet ML Documentation