Open · 0 of 2 sub-issues completed
Labels: enhancement (New feature or request)
Description
Currently, the application supports the Ollama backend to serve local LLMs, but it should support other providers as well (e.g. Gemini, Claude, GPT). The user should be able to select the relevant LLM provider and use both the chat and RAG functionalities with it. Possibly use LiteLLM as the proxy server.
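One possible shape for the provider selection described above: map the user's chosen provider to a LiteLLM-style model string, which a LiteLLM proxy can then route to the right backend. This is only a sketch; the provider names, prefixes, and function name below are illustrative assumptions, not the application's actual design.

```python
# Hypothetical sketch of provider selection for a LiteLLM-based setup.
# The prefix table and resolve_model helper are assumptions for
# illustration; they are not part of the current codebase.

PROVIDER_PREFIXES = {
    "ollama": "ollama/",      # local models, current backend
    "gemini": "gemini/",
    "claude": "anthropic/",
    "gpt": "openai/",
}

def resolve_model(provider: str, model: str) -> str:
    """Map a (provider, model) selection to a LiteLLM-style model name."""
    try:
        prefix = PROVIDER_PREFIXES[provider]
    except KeyError:
        raise ValueError(f"Unsupported provider: {provider!r}")
    return prefix + model

print(resolve_model("ollama", "llama3"))        # ollama/llama3
print(resolve_model("claude", "claude-3-haiku"))
```

Both the chat and RAG paths could then call the proxy through a single OpenAI-compatible client, keeping provider differences out of the application code.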
Sub-issues