This project is still in an early development phase. Everything is subject to change at any time, and there is no guarantee of stability or backward compatibility. Do not use it in production environments.
We welcome feedback and contributions, but please be aware that structures, APIs, and implementations may change significantly.
The following AI engines are currently supported:
- Mistral (ai_engine.name: "MistralAI")
- OpenAI, including all OpenAI-compatible providers such as Open-WebUI (ai_engine.name: "OpenAI")
- Ollama (ai_engine.name: "OllamaAI")
- Anthropic (ai_engine.name: "AnthropicAI")
- DeepL, translation only (work in progress)
Minimal configuration (example using Mistral):
'itomig-ai-base' => array (
    'ai_engine.configuration' => array (
        'api_key' => '***',
        'url' => 'https://api.mistral.ai/v1/',
        'model' => 'open-mistral-nemo',
    ),
    'ai_engine.name' => 'MistralAI',
),
Example of using OpenAI:
'itomig-ai-base' => array (
    'ai_engine.configuration' => array (
        'api_key' => '***',
        'url' => 'https://api.openai.com/v1/',
    ),
    'ai_engine.name' => 'OpenAI',
),
Example of using Anthropic:
'itomig-ai-base' => array (
    'ai_engine.configuration' => array (
        'api_key' => '***',
        'url' => 'https://api.anthropic.com/v1/messages',
        'model' => 'claude-3-5-sonnet-latest',
    ),
    'ai_engine.name' => 'AnthropicAI',
),
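Note that Anthropic's Messages API differs from the OpenAI-compatible providers: it authenticates with an `x-api-key` header (plus a mandatory `anthropic-version` header) and expects a POST body containing `model`, `max_tokens`, and `messages`. As a minimal sketch, the request that would be sent to the endpoint configured above looks roughly like this (the prompt text is a placeholder, and this illustrates the API's wire format, not the extension's internal code):

```python
import json

# Headers used by the Anthropic Messages API (POST /v1/messages).
headers = {
    "x-api-key": "***",                 # the configured api_key
    "anthropic-version": "2023-06-01",  # required API version header
    "content-type": "application/json",
}

# Minimal request body: model, max_tokens, and at least one message.
body = {
    "model": "claude-3-5-sonnet-latest",  # matches the configuration above
    "max_tokens": 1024,                   # required by the Messages API
    "messages": [
        {"role": "user", "content": "Hello"},  # a single user turn
    ],
}

payload = json.dumps(body)
print(payload)
```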
Example of using Ollama:
'itomig-ai-base' => array (
    'ai_engine.configuration' => array (
        'url' => 'http://127.0.0.1:11434/api/', // or wherever your Ollama instance is running (plain HTTP by default)
        'model' => 'qwen2.5:14b', // see ollama.com/library for available models; pull the model first (e.g. with `ollama pull`)
    ),
    'ai_engine.name' => 'OllamaAI',
),
Example of using the OpenAI-compatible API against a custom endpoint (e.g. a self-hosted Ollama or Open-WebUI server):
'itomig-ai-base' => array (
    'ai_engine.configuration' => array (
        'api_key' => '***',
        'url' => 'https://your.ollama-or-openwebui-server.com',
        'model' => 'your-model-name', // e.g. llama3.1:latest, see ollama.com -> models
    ),
    'ai_engine.name' => 'OpenAI',
),
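Across the examples above, the required keys differ per engine: Ollama needs no `api_key`, the plain OpenAI example omits `model` (relying on a default), and the other engines set `api_key`, `url`, and `model`. As an illustration of these differences, here is a small sketch of how such a configuration fragment could be sanity-checked before deployment. The function and the exact key requirements are inferred from the examples above and are assumptions, not part of the extension:

```python
# Hypothetical sanity check for an 'itomig-ai-base' configuration fragment.
# Key requirements are inferred from the documentation examples and may not
# match the extension's actual validation logic.
REQUIRED_KEYS = {
    "MistralAI":   {"api_key", "url", "model"},
    "OpenAI":      {"api_key", "url"},       # 'model' appears to be optional
    "AnthropicAI": {"api_key", "url", "model"},
    "OllamaAI":    {"url", "model"},         # a local Ollama needs no API key
}

def check_config(fragment: dict) -> list[str]:
    """Return a list of problems found in the fragment (empty if it looks OK)."""
    engine = fragment.get("ai_engine.name")
    if engine not in REQUIRED_KEYS:
        return [f"unknown ai_engine.name: {engine!r}"]
    conf = fragment.get("ai_engine.configuration", {})
    return [f"missing '{key}' for {engine}"
            for key in sorted(REQUIRED_KEYS[engine] - conf.keys())]

# Mirrors the minimal Mistral configuration above.
mistral = {
    "ai_engine.name": "MistralAI",
    "ai_engine.configuration": {
        "api_key": "***",
        "url": "https://api.mistral.ai/v1/",
        "model": "open-mistral-nemo",
    },
}
print(check_config(mistral))  # []
```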