Support one-click running of local AI models on Windows:
- Now supports Phi 4 mini reasoning, Phi 4, Gemma 3, Mistral 2503, Qwen 3, DeepSeek Distill Llama, and Llama 3.2 on Windows.
- Models of different sizes are built in and selected according to the local machine's GPU.
- Added the papersgpt agent for complex tasks on Windows.
- Compatible with Ollama: if you use Ollama, choose Customized and set the Customized API URL to http://localhost:11434/api/chat (see the sketch after this list).
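
As a quick sanity check that a local Ollama server is reachable at that URL, a request like the one below can be sent to the same /api/chat endpoint. This is a minimal sketch, not part of the plugin itself: the model name `llama3.2` is an assumption and should be replaced with a model you have already pulled.

```python
import json
import urllib.request

# Minimal sketch: send one chat message to the local Ollama server.
# Assumes Ollama is running on its default port and that the "llama3.2"
# model has already been pulled (e.g. via `ollama pull llama3.2`).
url = "http://localhost:11434/api/chat"
payload = {
    "model": "llama3.2",  # assumption: use any model available locally
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,      # ask for a single JSON response instead of a stream
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

# For non-streaming requests, the reply text is in body["message"]["content"].
print(body["message"]["content"])
```

If this prints a reply, the same URL should work when entered as the Customized API URL in the plugin settings.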