
Service initialization failed: LLM chat dependencies are missing #4659

@bbeasley-zscaler

Description


I spun up a local contextforge container with the environment variable LLMCHAT_ENABLED=true, then created an LLM Provider and a Model in the UI. When I opened the LLM Chat tab in the admin UI and clicked "Connect", I got the error:

Service initialization failed: LLM chat dependencies are missing. Install them with: pip install '.[llmchat]'
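For reference, this is roughly how the container was started. A minimal sketch of the quick-start invocation: the image tag and LLMCHAT_ENABLED=true come from the report, while the container name and the 4444 port mapping are assumptions (adjust to your setup).

```bash
# Sketch of the repro: run the published image with LLM chat enabled.
# LLMCHAT_ENABLED=true and the image tag are from the report above;
# the container name and the 4444 port mapping are assumptions.
docker run -d --name contextforge \
  -p 4444:4444 \
  -e LLMCHAT_ENABLED=true \
  ghcr.io/ibm/mcp-context-forge:v1.0.0
```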


This happens with both ghcr.io/ibm/mcp-context-forge:v1.0.0 and ghcr.io/ibm/mcp-context-forge:latest.
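As a possible stopgap until the images bundle the extra, the dependency group named in the error could be installed into the running container. This is only a sketch under assumptions: it presumes the image installs the gateway from a PyPI package named mcp-contextforge-gateway (so the [llmchat] extra resolves against it), that pip is on the PATH inside the container, that the container can reach PyPI, and that the container name matches the sketch above.

```bash
# Sketch of a workaround: install the optional llmchat dependency group
# named in the error, then restart so the service re-initializes.
# Assumes a PyPI-installed gateway package named mcp-contextforge-gateway;
# if the image installs from source instead, run pip install '.[llmchat]'
# from the source directory, as the error message itself suggests.
docker exec -u root contextforge pip install 'mcp-contextforge-gateway[llmchat]'
docker restart contextforge
```

Note that changes made with docker exec do not survive recreating the container; baking the same pip install into a derived image would be the durable variant.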
