# VT.ai

Minimal multimodal AI chat app with dynamic conversation routing.
## Supported Model Providers
- DeepSeek
- OpenAI
- Anthropic
- Google Gemini
- Local Models via Ollama (Llama3, Phi-3, Mistral, etc.)
- Cohere
- OpenRouter
## Core Capabilities
- Dynamic conversation routing with SemanticRouter
- Multi-modal interactions (Text/Image/Audio)
- Assistant framework with code interpretation
- Real-time response streaming
- Cross-provider model switching
- Local model support with Ollama integration
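Real-time response streaming is typically implemented as an async generator that yields tokens as they arrive from the provider. A minimal illustration of the pattern (a toy sketch, not VT.ai's actual code):

```python
import asyncio

async def stream_response(tokens):
    """Toy stand-in for a provider's streaming API: yield tokens one by one."""
    for tok in tokens:
        await asyncio.sleep(0)  # simulate waiting on the network
        yield tok

async def collect():
    """Consume the stream, as a UI would when rendering incrementally."""
    chunks = []
    async for tok in stream_response(["Hel", "lo", "!"]):
        chunks.append(tok)
    return "".join(chunks)
```

In the real app, the UI renders each chunk as it is yielded instead of collecting them.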
## Prerequisites

- Python 3.11+ (specified in `.python-version`)
- uv (for managing virtual environments and dependencies)
- Ollama (for local models, optional)
## Installation

```bash
pip install uv
git clone https://github.com/vinhnx/VT.ai.git
cd VT.ai
uv venv
source .venv/bin/activate   # Linux/Mac
# .venv\Scripts\activate    # Windows
uv pip install -r requirements.txt
cp .env.example .env
```
Note: `uv` provides a fast and efficient way to manage Python environments and dependencies. Activate the virtual environment before running the app.
## Configuration

Populate the `.env` file with your API keys. Depending on the models and features you want to use, set the corresponding keys:

- `OPENAI_API_KEY`: Required for OpenAI models, assistant mode, TTS, and image generation.
- `GEMINI_API_KEY`: Required for Google Gemini models and vision capabilities.
- `ANTHROPIC_API_KEY`: Required for Anthropic models.
- `GROQ_API_KEY`: Required for Groq models.
- `COHERE_API_KEY`: Required for Cohere models.
- `OPENROUTER_API_KEY`: Required for OpenRouter models.
- `MISTRAL_API_KEY`: Required for Mistral models.
- `HUGGINGFACE_API_KEY`: Required for Hugging Face models (if applicable).

Refer to `.env.example` for the full list. Example:
```env
OPENAI_API_KEY=sk-your-key
GEMINI_API_KEY=your-gemini-key
COHERE_API_KEY=your-cohere-key
ANTHROPIC_API_KEY=your-claude-key
HUGGINGFACE_API_KEY=your-huggingface-key
GROQ_API_KEY=your-groq-key
OPENROUTER_API_KEY=your-openrouter-key
MISTRAL_API_KEY=your-mistral-key

# For local models via Ollama
OLLAMA_HOST=http://localhost:11434
```
Note: If `OPENAI_API_KEY` or `GEMINI_API_KEY` is missing, the app will prompt you to enter the missing key at startup.
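The startup check described above amounts to scanning the environment for the required keys. A hedged sketch of that logic (`missing_keys` and the key lists here are illustrative, not VT.ai's actual code):

```python
# Illustrative sketch: report which required provider keys are absent
# before launching, mirroring the startup prompt behavior described above.
REQUIRED_KEYS = ["OPENAI_API_KEY", "GEMINI_API_KEY"]

def missing_keys(env: dict[str, str]) -> list[str]:
    """Return required keys that are unset or empty in the given environment."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```

In practice you would pass `os.environ` and prompt the user for each name returned.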
## Running the App

```bash
source .venv/bin/activate   # Linux/Mac
# .venv\Scripts\activate    # Windows

# (Optional) Train the semantic router
python src/router/trainer.py

# Start the app with auto-reload
chainlit run src/app.py -w
```

Note: Training the semantic router is optional. Use the provided `layers.json` for default routing, or train your own for better performance/customization. Training requires an OpenAI API key.
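Conceptually, semantic routing embeds each route's example utterances and sends a query to the route whose embedding is most similar. VT.ai uses the SemanticRouter library for this; the toy sketch below substitutes hand-made vectors for real embeddings purely to show the idea (route names and vectors are invented for illustration):

```python
import math

# Toy "embeddings" standing in for each route's averaged utterance vectors.
ROUTES = {
    "image-gen": [1.0, 0.1, 0.0],
    "chat":      [0.1, 1.0, 0.1],
    "vision":    [0.0, 0.1, 1.0],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def route(query_vec):
    """Pick the route whose vector is most similar to the query's embedding."""
    return max(ROUTES, key=lambda name: cosine(query_vec, ROUTES[name]))
```

Training amounts to producing good route vectors from labeled examples, which is why the trainer needs an embedding provider (here, OpenAI).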
## Keyboard Shortcuts

| Shortcut | Action                     |
|----------|----------------------------|
| Ctrl+/   | Switch model provider      |
| Ctrl+,   | Open settings              |
| Ctrl+L   | Clear conversation history |
## Chat Mode

- Multi-LLM conversations
- Dynamic model switching
- Image generation & analysis
- Audio transcription
## Assistant Mode

```python
# Example from the assistant framework (MinoAssistant is VT.ai's class):
async def solve_math_problem(problem: str):
    assistant = MinoAssistant()
    return await assistant.solve(problem)
```
- Code interpreter for complex calculations
- File attachments (PDF/CSV/Images)
- Persistent conversation threads
- Custom tool integrations
## Project Structure

```
VT.ai/
├── src/
│   ├── assistants/      # Custom AI assistant implementations
│   ├── router/          # Semantic routing configuration
│   ├── utils/           # Helper functions & configs
│   └── app.py           # Main application entrypoint
├── public/              # Static assets
├── requirements.txt     # Python dependencies
└── .env.example         # Environment template
```
## Supported Models

| Category  | Models                                                   |
|-----------|----------------------------------------------------------|
| Chat      | GPT-4o, Claude 3.5, Gemini 1.5, Llama3-70B, Mixtral 8x7B |
| Vision    | GPT-4o, Gemini 1.5 Pro, Llama3.2 Vision                  |
| Image Gen | DALL-E 3                                                 |
| TTS       | OpenAI TTS-1, TTS-1-HD                                   |
| Local     | Llama3, Phi-3, Mistral, DeepSeek R1 series               |
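Cross-provider switching (via LiteLLM, per the acknowledgments) boils down to addressing every model with a provider-prefixed identifier. A sketch of how such a mapping might look; the aliases and exact model IDs below are illustrative assumptions, not VT.ai's real configuration:

```python
# Illustrative alias table: user-facing names -> LiteLLM-style
# provider-qualified model identifiers (IDs here are assumptions).
MODEL_ALIASES = {
    "gpt-4o":     "openai/gpt-4o",
    "claude-3.5": "anthropic/claude-3-5-sonnet-20240620",
    "gemini-1.5": "gemini/gemini-1.5-pro",
    "llama3":     "ollama/llama3",  # local model served by Ollama
}

def resolve_model(alias: str) -> str:
    """Map a user-facing alias to a provider-qualified model ID."""
    try:
        return MODEL_ALIASES[alias]
    except KeyError:
        raise ValueError(f"Unknown model alias: {alias!r}")
```

Switching providers is then just passing a different resolved ID to the same completion call.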
## Development

```bash
# Activate the virtual environment
source .venv/bin/activate   # Linux/Mac
# .venv\Scripts\activate    # Windows

# Install development tools
uv pip install pytest black

# Run tests (once tests are added)
pytest tests/

# Format code
black .
```
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Add type hints for new functions
4. Update documentation
5. Open a Pull Request
Note: Development dependencies (e.g., `pytest`, `black`) are not currently specified in a separate file. Install them manually with `uv pip install` as needed.
## License

MIT License. See `LICENSE` for the full text.

## Acknowledgments

- Inspired by Chainlit for the chat interface
- Powered by LiteLLM for model abstraction
- Semantic routing via SemanticRouter