
Fabric MCP Server: Seamlessly integrate Fabric AI capabilities into MCP-enabled tools like IDEs and chat interfaces.

Fabric MCP Server


Connect the power of the Fabric AI framework to any Model Context Protocol (MCP) compatible application.

This project implements a standalone server that bridges the gap between Daniel Miessler's Fabric framework and the Model Context Protocol (MCP). It allows you to use Fabric's patterns, models, and configurations directly within MCP-enabled environments like IDE extensions or chat interfaces.

Imagine seamlessly using Fabric's specialized prompts for code explanation, refactoring, or creative writing right inside your favorite tools!

What is this?

  • Fabric: An open-source framework for augmenting human capabilities using AI, focusing on prompt engineering and modular AI workflows.
  • MCP: An open standard protocol enabling AI applications (like IDEs) to securely interact with external tools and data sources (like this server).
  • Fabric MCP Server: This project acts as an MCP server, translating MCP requests into calls to a running Fabric instance's REST API (fabric --serve).

Key Goals & Features (Based on Design)

  • Seamless Integration: Use Fabric patterns and capabilities directly within MCP clients without switching context.
  • Enhanced Workflows: Empower LLMs within IDEs or other tools to leverage Fabric's specialized prompts and user configurations.
  • Standardization: Adhere to the open MCP standard for AI tool integration.
  • Leverage Fabric Core: Build upon the existing Fabric CLI and REST API without modifying the core Fabric codebase.
  • Expose Fabric Functionality: Provide MCP tools to list patterns, get pattern details, run patterns, list models/strategies, and retrieve configuration.
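As a concrete illustration of the last point, an MCP host invokes one of these tools by sending a JSON-RPC `tools/call` request. The sketch below shows what such a message might look like; `tools/call` is the standard MCP method name, but the argument names inside `arguments` are assumptions for illustration, not the server's actual schema.

```python
# Illustrative MCP JSON-RPC message an MCP host could send to this server
# to invoke a Fabric pattern. "tools/call" is the standard MCP method;
# the argument names below are assumptions, not the real tool schema.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fabric_run_pattern",
        "arguments": {
            "pattern_name": "summarize",  # assumed parameter name
            "input_text": "Text to run the pattern against...",
        },
    },
}
print(json.dumps(request, indent=2))
```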

How it Works

  1. An MCP Host (e.g., an IDE extension) connects to this Fabric MCP Server.
  2. The Host discovers available tools (like fabric_run_pattern) via MCP's list_tools() mechanism.
  3. When the user invokes a tool (e.g., asking the IDE's AI assistant to refactor code using a Fabric pattern), the Host sends an MCP request to this server.
  4. The Fabric MCP Server translates the MCP request into a corresponding REST API call to a running fabric --serve instance.
  5. The fabric --serve instance executes the pattern.
  6. The Fabric MCP Server receives the response (potentially streaming) from Fabric and translates it back into an MCP response for the Host.
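Step 4's translation can be sketched as a pure mapping from a tool invocation to an HTTP request description. This is only a sketch under stated assumptions: the endpoint paths and the bearer-token auth scheme below are hypothetical, not the actual Fabric REST API.

```python
import os

def build_fabric_request(tool_name: str, arguments: dict) -> dict:
    """Translate an MCP tool invocation into a Fabric REST call description.

    Sketch only: the endpoint paths and the Authorization header scheme
    are assumptions for illustration, not the real fabric --serve API.
    """
    base = os.environ.get("FABRIC_BASE_URL", "http://127.0.0.1:8080")
    headers = {}
    api_key = os.environ.get("FABRIC_API_KEY")
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"  # assumed auth scheme
    routes = {
        "fabric_list_patterns": ("GET", f"{base}/patterns/names"),   # assumed path
        "fabric_run_pattern": ("POST", f"{base}/chat"),              # assumed path
    }
    method, url = routes[tool_name]
    return {"method": method, "url": url, "headers": headers, "json": arguments}
```

A client library such as `httpx` would then execute the described request and stream the response back as MCP messages.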

Project Status

This project is currently in the implementation phase.

The core architecture and proposed tools are outlined in the High-Level Design Document.

This project uses Task Master to manage development tasks. Install it globally:

# you can also use pnpm if you prefer
npm install -g task-master-ai

And occasionally you should upgrade it:

# or use "pnpm upgrade -g task-master-ai"
npm upgrade -g task-master-ai

Read the Task Master docs for how to set up your .env file with the appropriate API keys.

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

  • Python >= 3.10
  • uv (Python package and environment manager) for developers

Installation From Source (for developers)

  1. Clone the repository:

    git clone https://github.com/ksylvan/fabric-mcp.git
    cd fabric-mcp
  2. Install dependencies using uv sync:

    uv sync --dev

    This command ensures your virtual environment matches the dependencies in pyproject.toml and uv.lock, creating the environment on the first run if necessary.

  3. Activate the virtual environment (uv will create it if needed):

    • On macOS/Linux:

      source .venv/bin/activate
    • On Windows:

      .venv\Scripts\activate

Now you have the development environment set up!

Installation From PyPI (for users)

If you just want to use the fabric-mcp server without developing it, you can install it directly from PyPI:

# Using pip
pip install fabric-mcp

# Or using uv
uv pip install fabric-mcp

This will install the package and its dependencies. You can then run the server using the fabric-mcp command.

Configuration (Environment Variables)

The fabric-mcp server can be configured using the following environment variables:

  • FABRIC_BASE_URL: The base URL of the running Fabric REST API server (fabric --serve).
    • Default: http://127.0.0.1:8080
  • FABRIC_API_KEY: The API key required to authenticate with the Fabric REST API server, if it's configured to require one.
    • Default: None (Authentication is not attempted if not set).
  • FABRIC_MCP_LOG_LEVEL: Sets the logging verbosity for the fabric-mcp server itself.
    • Options: DEBUG, INFO, WARNING, ERROR, CRITICAL (case-insensitive).
    • Default: INFO
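The defaults above could be read with a small helper like the following. This is a sketch that mirrors the documented defaults, not fabric-mcp's actual internal code.

```python
import os

def load_config() -> dict:
    """Read fabric-mcp settings from the environment, mirroring the
    documented defaults. Illustrative sketch, not the server's real code."""
    return {
        "base_url": os.environ.get("FABRIC_BASE_URL", "http://127.0.0.1:8080"),
        # None means authentication is not attempted against fabric --serve.
        "api_key": os.environ.get("FABRIC_API_KEY"),
        # The log level is case-insensitive, so normalize to upper case.
        "log_level": os.environ.get("FABRIC_MCP_LOG_LEVEL", "INFO").upper(),
    }
```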

You can set these variables in your shell environment (or put them into a .env file in the working directory) before running fabric-mcp:

export FABRIC_BASE_URL="http://your-fabric-host:port"
# This must match the key used by fabric --serve
export FABRIC_API_KEY="your_secret_api_key"
export FABRIC_MCP_LOG_LEVEL="DEBUG"

fabric-mcp --stdio

Contributing

Feedback on the design document is highly welcome! Please open an issue to share your thoughts or suggestions.

Read the contribution document and follow the guidelines for this repository.

Also refer to the contributor cheat-sheet, which contains a micro-summary of the development workflow.

License

Copyright (c) 2025, Kayvan Sylvan. Licensed under the MIT License.
