
Doesn't work with godot mcp #15

@DocMAX

Description

[docmax@x1cg9 ollama-mcp-bridge]$ npm run start

> [email protected] start
> ts-node src/main.ts

13:21:33 INFO:     LLMBridge - Starting main.ts...
13:21:33 INFO:     LLMBridge - Loaded bridge configuration from /opt/github/ollama-mcp-bridge/bridge_config.json
13:21:33 INFO:     LLMBridge - Initializing bridge with MCPs:
13:21:33 DEBUG:     LLMBridge - Initializing Ollama client with baseURL: http://127.0.0.1:11434
13:21:33 INFO:     LLMBridge - Connecting to MCP servers...
13:21:33 INFO:     LLMBridge - Connecting to MCP: primary
13:21:33 DEBUG:     LLMBridge - [MCP Client] Starting connection...
13:21:33 DEBUG:     LLMBridge - [MCP Client] Using working directory: /home/docmax/bridgeworkspace
13:21:33 DEBUG:     LLMBridge - [MCP Client] Spawning process: node /home/docmax/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js /home/docmax/bridgeworkspace
13:21:33 DEBUG:     LLMBridge - [MCP Client] Initializing session...
13:21:33 DEBUG:     LLMBridge - [MCP Client] Sending message: {"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"0.1.0","capabilities":{"tools":{"call":true,"list":true}},"clientInfo":{"name":"MCPLLMBridge","version":"1.0.0"}},"id":1}
13:21:33 ERROR:     LLMBridge - [MCP Client] Process error: spawn node ENOENT
13:21:33 ERROR:     LLMBridge - [MCP Client] Failed to send message: write EPIPE
13:21:33 ERROR:     LLMBridge - [MCP Client] Session initialization failed: write EPIPE
13:21:33 ERROR:     LLMBridge - [MCP Client] Connection failed: write EPIPE
13:21:33 ERROR:     LLMBridge - Bridge initialization failed: write EPIPE
13:21:33 ERROR:     LLMBridge - Fatal error: Failed to initialize bridge
13:21:33 INFO:     LLMBridge - Exiting process...
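The key line in the log above is "spawn node ENOENT": the bridge tried to launch the MCP child process with the bare command "node", and no node executable could be found in the environment it used for the spawn, so the child never started. The "write EPIPE" errors that follow are just fallout from trying to send the initialize message to that dead child. A hedged config sketch addressing this follows the config dump below.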

[docmax@x1cg9 ollama-mcp-bridge]$ cat /opt/github/ollama-mcp-bridge/bridge_config.json
{
  "mcpServers": {
    "godot-mcp": {
      "command": "/home/docmax/.volta/bin/node",
      "args": [
        "/opt/github/godot-mcp/server/dist/index.js"
      ],
      "env": {
        "MCP_TRANSPORT": "stdio"
      }
    }
  },
  "llm": {
    "model": "llama3.2",
    "baseUrl": "http://localhost:11434",
    "apiKey": "ollama",
    "temperature": 0.7,
    "maxTokens": 1000
  },
  "systemPrompt": "You are a helpful assistant that can use various tools to help answer questions. You have access to multiple MCPs including filesystem operations, GitHub interactions, Brave search."
}
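Two hedged observations on the config above, offered as a sketch of what to check rather than a confirmed fix. First, the log connects to an MCP named "primary" and spawns @modelcontextprotocol/server-filesystem with the bare command "node", while the file printed here only declares "godot-mcp" with an absolute Node path, so the bridge appears to be reading a different or additional server definition than the one shown. Second, the ENOENT should go away once every "command" points at an executable that actually exists, for example the same Volta-managed binary the "godot-mcp" entry already uses. The names and paths in the sketch below are taken from the log and config above; whether the bridge actually expects a "primary" entry in this file is an assumption.

{
  "mcpServers": {
    "primary": {
      "command": "/home/docmax/.volta/bin/node",
      "args": [
        "/home/docmax/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
        "/home/docmax/bridgeworkspace"
      ]
    },
    "godot-mcp": {
      "command": "/home/docmax/.volta/bin/node",
      "args": [
        "/opt/github/godot-mcp/server/dist/index.js"
      ],
      "env": {
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}

With both entries using the absolute path /home/docmax/.volta/bin/node, neither spawn depends on node being resolvable from the bridge's PATH.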
