This issue tracks the integration of LiteLLM to provide unified access to multiple LLM providers through a single, consistent interface. It will replace the custom provider implementations and enable support for 100+ models across providers.
Motivation
- Complexity: Multiple custom provider implementations (`OllamaProvider`, `GeminiProvider`) increase maintenance burden
- Limited flexibility: Adding new providers requires custom implementation
- Inconsistent API: Different providers have different interfaces
- Solution: LiteLLM provides a unified interface for all major LLM providers
Expected Changes
Core Implementation
- New `LiteLLMProvider` class in `models.py`: unified completion API for all providers
- Automatic response format conversion
- Support for streaming, JSON mode, temperature, top_p
- Configurable api_base for local providers
Enhanced Provider Initialization
- LiteLLM enabled by default
- Automatic Ollama model prefixing (`ollama/{model}`): just set `ollama` as the provider and the Ollama model as the default model in the env file
- Backward compatible with legacy providers
- Smart provider detection
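The automatic prefixing above could look like the following. The function name is an assumption; the behavior it sketches is that LiteLLM routes by model-string prefix, so a bare Ollama model name must become `ollama/{model}` before the call:

```python
# Sketch of automatic Ollama model prefixing (function name is hypothetical).
def resolve_model(provider: str, model: str) -> str:
    """Prefix Ollama model names for LiteLLM routing; pass others through."""
    if provider.lower() == "ollama" and not model.startswith("ollama/"):
        return f"ollama/{model}"
    return model
```

Already-prefixed names are left untouched, which keeps the change backward compatible with configs that spell out `ollama/...` explicitly.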
Extended Provider Support
- Add `DEEPSEEK`, `OPENAI`, `ANTHROPIC` to the `ModelProvider` enum
- Add 15+ new model configurations
- Environment variable support for API keys
- Configurable Ollama API base URL
🚀 Supported Providers
| Provider | Example Models | Environment Variable | Cost |
|---|---|---|---|
| Ollama | gemma3:4b, mistral:7b, qwen3:4b | None | Free (local) |
| DeepSeek | deepseek-chat, deepseek-coder | `DEEPSEEK_API_KEY` | Paid |
| Google Gemini | gemini-2.5-flash, gemini-2.5-pro | `GEMINI_API_KEY` | Paid |
| OpenAI | gpt-4, gpt-4-turbo, gpt-3.5-turbo | `OPENAI_API_KEY` | Paid |
| Anthropic | claude-3-opus, claude-3-sonnet | `ANTHROPIC_API_KEY` | Paid |
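Putting the table together with the Ollama defaults above, an env file could look like the following. This is a hedged sketch: only the API-key variable names come from the table; `PROVIDER`, `MODEL`, and `OLLAMA_API_BASE` are hypothetical variable names for illustration.

```shell
# Hypothetical .env — variable names other than the API keys are assumptions
PROVIDER=ollama
MODEL=gemma3:4b
OLLAMA_API_BASE=http://localhost:11434

# Or a hosted provider:
# PROVIDER=deepseek
# MODEL=deepseek-chat
# DEEPSEEK_API_KEY=<your key>
```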
📦 Dependencies
```shell
pip install litellm
```