This guide shows you how to configure SmarterRouter to route to external cloud providers (OpenAI, Anthropic, Google, etc.) alongside your local Ollama models.
External provider support consists of two parts:
- provider.db - Benchmark database with scores for 400+ models
- External API Integration - Actually route requests to external providers
| Provider | Model Prefix | API Key | Base URL |
|---|---|---|---|
| OpenAI | `openai/` | `OPENAI_API_KEY` | `https://api.openai.com/v1` |
| Anthropic | `anthropic/` | `ANTHROPIC_API_KEY` | `https://api.anthropic.com/v1` |
| Google | `google/` | `GOOGLE_API_KEY` | `https://generativelanguage.googleapis.com/v1` |
| Cohere | `cohere/` | `COHERE_API_KEY` | `https://api.cohere.ai/v1` |
| Mistral | `mistral/` | `MISTRAL_API_KEY` | `https://api.mistral.ai/v1` |
```bash
# 1. Edit your .env file
nano .env

# Add these settings:
ROUTER_EXTERNAL_PROVIDERS_ENABLED=true
ROUTER_EXTERNAL_PROVIDERS=openai,anthropic,google
ROUTER_OPENAI_API_KEY=sk-...
ROUTER_ANTHROPIC_API_KEY=sk-ant-...

# 2. Restart SmarterRouter
docker-compose restart

# 3. Use external models
# In OpenWebUI, select models with prefix: openai/gpt-4o, anthropic/claude-3-opus
```

All configuration goes in your `.env` file:
```bash
# Enable external provider routing
ROUTER_EXTERNAL_PROVIDERS_ENABLED=true

# List of providers to use (comma-separated)
ROUTER_EXTERNAL_PROVIDERS=openai,anthropic,google

# API keys (at least one required)
ROUTER_OPENAI_API_KEY=sk-...
ROUTER_ANTHROPIC_API_KEY=sk-ant-...
ROUTER_GOOGLE_API_KEY=...
ROUTER_COHERE_API_KEY=...
ROUTER_MISTRAL_API_KEY=...

# Optional: custom base URLs (for proxies/self-hosted)
ROUTER_ANTHROPIC_BASE_URL=https://custom-endpoint.com
ROUTER_GOOGLE_BASE_URL=https://custom-endpoint.com
```

Note: provider.db is used automatically for benchmark data. No additional configuration is needed - it downloads to `data/provider.db` and updates every 4 hours.
Use provider prefixes to identify external models:

- `openai/gpt-4o` - OpenAI's GPT-4o
- `openai/gpt-4-turbo` - OpenAI's GPT-4 Turbo
- `anthropic/claude-3-opus` - Anthropic's Claude 3 Opus
- `anthropic/claude-3-sonnet` - Anthropic's Claude 3 Sonnet
- `google/gemini-1.5-pro` - Google's Gemini 1.5 Pro
- `cohere/command-r-plus` - Cohere's Command R+
- `mistral/mistral-large` - Mistral's Large model
Models are automatically discovered from provider.db (400+ models available).
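To call one of these models, address it by its prefixed name through SmarterRouter's OpenAI-compatible chat endpoint. The sketch below is an assumption about the request shape (the port 11436 comes from the docker-compose example later in this guide; the `/v1/chat/completions` path follows the OpenAI API format, not a confirmed SmarterRouter route):

```python
# Sketch: send a prompt to an external model via the router's
# OpenAI-compatible endpoint. URL and path are assumptions.
import json
import urllib.request

ROUTER_URL = "http://localhost:11436/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload for a prefixed model name."""
    return {
        "model": model,  # e.g. "openai/gpt-4o" or "anthropic/claude-3-opus"
        "messages": [{"role": "user", "content": prompt}],
    }

def send(payload: dict) -> bytes:
    """POST the payload to the router (requires a running instance)."""
    req = urllib.request.Request(
        ROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```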
- Benchmark Data - provider.db provides scores for reasoning, coding, and general knowledge
- Hybrid Selection - the router combines:
  - Local model profiling (if the model is also available locally)
  - External benchmark scores from provider.db
  - Model capabilities (vision, tool calling)
- External API Call - when an external model is selected, BackendRegistry routes the request to the appropriate provider
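The hybrid-selection step can be sketched as follows. Field names, the capability filter, and the preference for local profiling data are illustrative assumptions, not SmarterRouter's actual internals:

```python
# Illustrative hybrid scoring: prefer measured local performance when a
# model has been profiled locally, otherwise fall back to provider.db
# benchmark scores; hard-filter models lacking a required capability.
def score_candidate(candidate: dict, needs_vision: bool = False) -> float:
    if needs_vision and not candidate.get("vision", False):
        return float("-inf")  # missing a required capability
    if "local_profile" in candidate:
        return candidate["local_profile"]  # measured local score
    return candidate.get("benchmark", 0.0)  # provider.db benchmark score

def pick_model(candidates: list[dict], **kwargs) -> str:
    """Return the name of the highest-scoring candidate."""
    return max(candidates, key=lambda c: score_candidate(c, **kwargs))["name"]
```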
You can use both local Ollama models and external providers simultaneously:

```bash
# Enable both
ROUTER_PROVIDER=ollama  # Keep local backend
ROUTER_EXTERNAL_PROVIDERS_ENABLED=true
ROUTER_EXTERNAL_PROVIDERS=openai,anthropic
```

In your client, you can:

- Use `llama3` (local), `openai/gpt-4o` (external), and `anthropic/claude-3-opus` (external)
- Let SmarterRouter automatically pick the best model for each prompt
For setups that only use external providers (no local Ollama):

```yaml
# docker-compose.external.yml
version: '3.8'
services:
  smarterrouter:
    image: smarterrouter:latest
    container_name: smarterrouter
    ports:
      - "11436:11436"
    environment:
      - ROUTER_PROVIDER=ollama  # Kept for compatibility, but won't connect
      - ROUTER_OLLAMA_URL=http://localhost:11434  # Won't be used
      - ROUTER_EXTERNAL_PROVIDERS_ENABLED=true
      - ROUTER_EXTERNAL_PROVIDERS=openai,anthropic,google
      - ROUTER_OPENAI_API_KEY=${OPENAI_API_KEY}
      - ROUTER_ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - ROUTER_GOOGLE_API_KEY=${GOOGLE_API_KEY}
    volumes:
      - ./data:/app/hubrouter/data
      - ./logs:/app/hubrouter/logs
    restart: unless-stopped
```

Run with:

```bash
docker-compose -f docker-compose.external.yml up -d
```

provider.db is automatically updated every 4 hours by default. You can customize:
```bash
# Update frequency (hours, 0 = disabled)
ROUTER_PROVIDER_DB_AUTO_UPDATE_HOURS=4

# Download URL (advanced users)
ROUTER_PROVIDER_DB_DOWNLOAD_URL=https://raw.githubusercontent.com/peva3/smarterrouter-provider/refs/heads/main/data/provider.db
```

Make sure:

- `ROUTER_EXTERNAL_PROVIDERS_ENABLED=true` is set
- The provider (e.g., `openai`) is in `ROUTER_EXTERNAL_PROVIDERS`
- The corresponding API key is set
Check that provider.db is present:

```bash
ls -la data/provider.db
# Should see a ~200KB file
```

If missing, the system will fall back to local models only or keyword-based routing.
External API latency can be higher than local models. Consider:

- Increasing `ROUTER_GENERATION_TIMEOUT` (default 120s)
- Using faster models for simple tasks
- Keeping some local models as fallback
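The "local fallback" idea can be sketched as a timeout wrapper (a minimal sketch with illustrative helper names, not SmarterRouter's implementation): try the external provider first, and fall back to a local model if the call exceeds the timeout.

```python
# Sketch: race an external call against a timeout, falling back to a
# local model on expiry. 120s mirrors the ROUTER_GENERATION_TIMEOUT default.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

def generate_with_fallback(external_call, local_call, timeout_s: float = 120.0):
    """Try the external provider; fall back to local on timeout."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(external_call)
        try:
            return future.result(timeout=timeout_s)
        except FutureTimeout:
            future.cancel()
            return local_call()
```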
API Key Storage:

- Store API keys in the `.env` file (not committed to git)
- Use Docker secrets or vault solutions for production
- Set proper file permissions: `chmod 600 .env`
Network Security:
- Use HTTPS endpoints (all default URLs are HTTPS)
- Consider using a proxy/VPN for additional privacy
- Monitor API usage for unexpected activity
Use with reverse proxies or self-hosted providers:

```bash
ROUTER_ANTHROPIC_BASE_URL=http://localhost:8080/v1
ROUTER_EXTERNAL_PROVIDERS=anthropic
ROUTER_ANTHROPIC_API_KEY=sk-ant-...
```

The endpoint must be OpenAI-compatible (same API format).
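For intuition on what a custom base URL resolves to, here is a small sketch of how the base URL composes into a full request path, assuming the OpenAI-style `/chat/completions` suffix (the suffix is an assumption based on the required API format, not a documented SmarterRouter detail):

```python
# Compose a base URL and the assumed OpenAI-style chat path.
def chat_url(base_url: str) -> str:
    """Join a provider base URL with the chat-completions path."""
    return base_url.rstrip("/") + "/chat/completions"
```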
If your external provider uses a different URL pattern, you can adjust the model prefix in `router/backends/external.py`. Most providers work with the defaults.
- Configuration Reference - All environment variables
- Backend Providers - Adding new backend types
- Performance Tuning - Optimize routing decisions