Support for multiple LLM providers #3

@debabratamishra

Description

Currently, the application supports only the Ollama backend for serving local LLMs. It should support other providers as well (e.g., Gemini, Claude, GPT). The user should be able to select the relevant LLM provider and use both the chat and RAG functionality with it. One option is to use LiteLLM as a proxy server. A minimal sketch of what provider selection could look like is shown below.
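If LiteLLM is adopted, provider selection would mostly reduce to a model-name prefix on a single `completion()` call. This is an illustrative sketch, not a proposed implementation: the `PROVIDER_MODELS` mapping, the specific model ids, and the local Ollama base URL are assumptions.

```python
# Sketch of provider-agnostic chat via LiteLLM (illustrative only).
# Requires: pip install litellm, plus provider API keys in the environment
# (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY).
from litellm import completion

# Hypothetical mapping from a user-facing provider choice to a LiteLLM model id.
PROVIDER_MODELS = {
    "ollama": "ollama/llama3",                            # local, served by Ollama
    "openai": "gpt-4o",                                   # GPT
    "anthropic": "anthropic/claude-3-5-sonnet-20240620",  # Claude
    "gemini": "gemini/gemini-1.5-pro",                    # Gemini
}

def chat(provider: str, prompt: str) -> str:
    """Route a single chat turn to whichever provider the user selected."""
    response = completion(
        model=PROVIDER_MODELS[provider],
        messages=[{"role": "user", "content": prompt}],
        # api_base is only needed for the local Ollama server;
        # hosted providers resolve their endpoints from the model id.
        api_base="http://localhost:11434" if provider == "ollama" else None,
    )
    return response.choices[0].message.content

print(chat("ollama", "Summarize the retrieved context: ..."))
```

Alternatively, running LiteLLM as a standalone proxy (`litellm --config config.yaml`) would expose all providers behind one OpenAI-compatible endpoint, which would let the existing Ollama-oriented HTTP client stay largely unchanged.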

Metadata

Labels

enhancement (New feature or request)
