I spun up a local ContextForge container with the environment variable LLMCHAT_ENABLED=true, then created an LLM Provider and Model in the UI. When I went to the LLM Chat tab in the admin interface and clicked "Connect", I got the error:
Service initialization failed: LLM chat dependencies are missing. Install them with: pip install '.[llmchat]'
This happens with both:
ghcr.io/ibm/mcp-context-forge:v1.0.0
and
ghcr.io/ibm/mcp-context-forge:latest
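For reference, the setup can be reproduced with a command along these lines (the container name, port mapping, and other flags are my assumptions; only the image tag and the LLMCHAT_ENABLED variable come from the report above):

```shell
# Hypothetical reproduction sketch -- flags other than the env var and
# image tag are assumptions, adjust to your local setup.
docker run -d --name contextforge \
  -e LLMCHAT_ENABLED=true \
  -p 4444:4444 \
  ghcr.io/ibm/mcp-context-forge:v1.0.0
```

The error text suggests the published image was built without the optional llmchat extra (pip install '.[llmchat]'), so enabling the feature via the environment variable alone is not sufficient with these tags.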