Description
The application fails to run inside a standard `node:20-bookworm-slim` Docker container.
The logs show that it finds and loads `bridge_config.json`, but then ignores its contents and uses default values. This triggers a fallback mechanism that crashes with a `[MCP Client] Process error: spawn node ENOENT` error.
This suggests the application does not correctly pass the `PATH` environment variable to the child processes it spawns, making it impossible for them to find the `node` executable. We have exhaustively tried to work around this by fixing `PATH` and creating symlinks, but the error persists.
```
docker run --rm -it ollama-mcp-bridge
15:26:59 INFO: LLMBridge - Starting main.ts...
15:26:59 INFO: LLMBridge - Loaded bridge configuration from /app/bridge_config.json
15:26:59 INFO: LLMBridge - Initializing bridge with MCPs:
15:26:59 DEBUG: LLMBridge - Initializing Ollama client with baseURL: http://127.0.0.1:11434/v1
15:26:59 INFO: LLMBridge - Connecting to MCP servers...
15:26:59 INFO: LLMBridge - Connecting to MCP: primary
15:26:59 DEBUG: LLMBridge - [MCP Client] Starting connection...
15:26:59 DEBUG: LLMBridge - [MCP Client] Using working directory: /root/bridgeworkspace
15:26:59 DEBUG: LLMBridge - [MCP Client] Spawning process: node /root/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js /root/bridgeworkspace
15:26:59 DEBUG: LLMBridge - [MCP Client] Initializing session...
15:26:59 DEBUG: LLMBridge - [MCP Client] Sending message: {"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"0.1.0","capabilities":{"tools":{"call":true,"list":true}},"clientInfo":{"name":"MCPLLMBridge","version":"1.0.0"}},"id":1}
15:26:59 ERROR: LLMBridge - [MCP Client] Process error: spawn node ENOENT
15:26:59 ERROR: LLMBridge - [MCP Client] Failed to send message: write EPIPE
15:26:59 ERROR: LLMBridge - [MCP Client] Session initialization failed: write EPIPE
15:26:59 ERROR: LLMBridge - [MCP Client] Connection failed: write EPIPE
15:26:59 ERROR: LLMBridge - Bridge initialization failed: write EPIPE
15:26:59 ERROR: LLMBridge - Fatal error: Failed to initialize bridge
15:26:59 INFO: LLMBridge - Exiting process...
```
```dockerfile
FROM node:20-bookworm-slim

# Set the application directory
WORKDIR /app

# Copy application source code
COPY ollama-mcp-bridge/ .

# --- FIX FOR THE ENOENT ERROR ---
# Create a symbolic link for node in a standard system path.
# This helps child processes find 'node' even if the PATH is not inherited correctly.
RUN ln -s /usr/local/bin/node /usr/bin/node

# Create the configuration file (heredoc requires BuildKit). The app will
# still ignore this, but the symlink above should fix the resulting crash.
COPY <<'EOF' bridge_config.json
{
  "ollama_model": "llama3",
  "ollama_host": "http://host.docker.internal:11434",
  "mcp_servers": [
    {
      "name": "primary",
      "command": "/app/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
      "exec": "/usr/local/bin/node"
    }
  ]
}
EOF

# Install dependencies and build
RUN corepack enable && npm ci && npm run build
RUN npm install @modelcontextprotocol/server-filesystem

# Run the application
CMD ["node", "dist/main.js"]
```