Vermeil is a lightweight, powerful, and cost-effective AI assistant designed to run efficiently on your local machine. At its core is a customized Large Language Model (LLM) based on Microsoft's `phi3:mini`, fine-tuned for high-quality, responsive interaction.
The project's main goal is to make advanced, personalized AI accessible to everyone without the high costs of proprietary cloud services. With a focus on modularity, Vermeil can be extended using the Model Context Protocol (MCP), allowing you to integrate your own data and tools seamlessly.
- 🧠 Custom LLM: Powered by a fine-tuned `phi3:mini` model for optimal performance and quality.
- 💸 Low-Cost: Engineered to minimize resource consumption, making it perfect for local deployment.
- 🔧 Extensible Architecture: Easily add new capabilities and context using the Model Context Protocol (MCP).
- 🐍 Python-Powered: Built with Python 3.10 for modern, robust performance.
- 🔜 Cross-Platform Vision: Currently available for desktop, with mobile and IoT versions on the horizon!
Vermeil is currently under heavy development. Features are being added and improved continuously. While the core functionality is operational, you may encounter bugs. We appreciate your patience and welcome contributions to help us build a stable and feature-rich application.
Follow these steps to get Vermeil running on your local machine.
- Ollama: Vermeil uses Ollama to run the local LLM.
- Python 3.10: This project is built and tested with Python 3.10; using it is strongly recommended.
- Git: To clone the repository.
1. Install Ollama:

   - For macOS & Linux:

     ```sh
     curl -fsSL https://ollama.com/install.sh | sh
     ```

   - For Windows: Download and run the installer from the Ollama website.

2. Download the Base Model: After installing Ollama, open your terminal and pull the `phi3:mini` model. This is the foundation for Vermeil.

   ```sh
   ollama pull phi3:mini
   ```

3. Create the Custom `vermeil` Model: Vermeil is designed to work with a custom model entry. In the root of this project you will find a `Modelfile`. Use it to create the `vermeil` model by running:

   ```sh
   ollama create vermeil -f Modelfile
   ```

   (You can skip this step and use the base `phi3:mini` model instead.)

4. Clone the repository:

   ```sh
   git clone https://github.com/Matrixxboy/vermeil.git
   cd vermeil
   ```

5. Set Model Configuration: Open the configuration file located at `core_project/core/Model/model_config.py`. Ensure the following variables point to your local Ollama instance and the custom model:

   ```python
   OLLAMA_URL = "http://localhost:11434/api/generate"
   OLLAMA_MODEL = "vermeil"  # or "phi3:mini" to use the default base model
   ```

6. Create a Virtual Environment & Install Dependencies:

   ```sh
   python3.10 -m venv venv
   source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
   pip install -r requirements.txt
   ```
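With the configuration from the steps above in place, the values in `model_config.py` can drive Ollama's standard `/api/generate` endpoint directly. The following is a minimal illustrative sketch, not code from the project — the helper names `build_payload` and `generate` are hypothetical:

```python
# Hypothetical sketch: querying a local Ollama server using the same
# OLLAMA_URL / OLLAMA_MODEL values set in model_config.py.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
OLLAMA_MODEL = "vermeil"  # or "phi3:mini" if you skipped the custom model step


def build_payload(prompt: str, model: str = OLLAMA_MODEL) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Hello, Vermeil!"))
```

Note that `generate` requires a running Ollama instance on port 11434; `build_payload` works standalone.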
Once Ollama is running and the project is configured, launch the application:

```sh
python main.py
```

One of Vermeil's most powerful features is its extensibility.
MCP allows you to inject custom context, tools, or data into the model before it processes a prompt.
This enables you to tailor Vermeil's knowledge and capabilities to your specific needs.
- Navigate to the `core_project/MCP/` directory.
- Add your custom protocol files or modules in this directory.
- The application will automatically detect and load them, making them available to the LLM.
This system is designed to be simple yet powerful, allowing for deep customization without altering the core codebase.
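The automatic detection described above could work roughly as follows. This is a hypothetical sketch using the standard-library `importlib` machinery — the function name `load_mcp_modules` and the discovery logic are illustrative, not the project's actual loader:

```python
# Illustrative sketch: discover and import every Python module placed in
# an MCP directory (e.g. core_project/MCP/), so its tools become available.
import importlib.util
from pathlib import Path


def load_mcp_modules(mcp_dir: str) -> dict:
    """Import each .py file in mcp_dir and return {module_name: module}."""
    modules = {}
    for path in sorted(Path(mcp_dir).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)  # run the module so it registers itself
        modules[path.stem] = module
    return modules
```

Under this scheme, dropping a new file into the MCP directory is all that is needed — no changes to the core codebase.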
We have an exciting future planned for Vermeil! Here's what we're working on:
- Core AI Assistant (Desktop)
- Fully-featured Mobile Application
- IoT Device Integration for smart home commands
- Advanced Tool Integration via MCP
- Community-driven MCP library
- Further performance and cost-optimization research
Contributions are what make the open-source community such an amazing place to learn, inspire, and create.
Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better:
1. Fork the Project

2. Create your Feature Branch

   ```sh
   git checkout -b feature/AmazingFeature
   ```

3. Commit your Changes

   ```sh
   git commit -m 'Add some AmazingFeature'
   ```

4. Push to the Branch

   ```sh
   git push origin feature/AmazingFeature
   ```

5. Open a Pull Request
Distributed under the MIT License. See LICENSE for more information.
Stay tuned for more updates! 🚀