support user-provided AI model providers (OpenAI, Anthropic, Ollama) #51

@AjayBandiwaddar

Description

Problem

While setting up Sugar-AI locally, I noticed that app/ai.py only
supports HuggingFace models through the RAGAgent class. There's no
way for users to plug in their own OpenAI, Anthropic, or Ollama models.

This means if a school has an OpenAI API key and wants to use it with
Sugar-AI, they currently can't. Passing openai/gpt-4 to the
/change-model endpoint would crash the app since it tries to load
it as a HuggingFace model.

Proposed Solution

Add provider detection in app/ai.py: in the __init__ and set_model
methods of RAGAgent, check the model string's prefix and route to the
matching client.

This way run and run_chat_completion would work the same way
regardless of provider.
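For illustration, a minimal sketch of what the prefix check could look like. The function name, the provider prefixes, and the fallback behavior are all assumptions, not Sugar-AI's actual convention:

```python
# Hypothetical helper for RAGAgent.set_model: split a model string such as
# "openai/gpt-4" into a (provider, model_name) pair. Strings without a
# recognized provider prefix fall back to the existing HuggingFace path,
# so HuggingFace repo ids like "meta-llama/Llama-2-7b" are untouched.

KNOWN_PROVIDERS = ("openai", "anthropic", "ollama")  # assumed prefixes

def detect_provider(model: str) -> tuple[str, str]:
    prefix, _, rest = model.partition("/")
    if rest and prefix.lower() in KNOWN_PROVIDERS:
        return prefix.lower(), rest
    return "huggingface", model
```

With this in place, `detect_provider("openai/gpt-4")` yields `("openai", "gpt-4")`, while an unprefixed string, or a HuggingFace repo id whose first segment is an org name, still routes to the HuggingFace loader.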
