A GNOME user interface to Ollama. Written in Python.
Supports multiple tabs of model responses to the Ollama endpoints /api/generate and /api/chat.
Select "New Response" to create a new generate tab using /api/generate.
Select "New Chat" to create a new chat tab using /api/chat with previous messages so context is preserved.
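The generate/chat split can be sketched with a minimal illustration of the Ollama /api/chat request shape, where earlier messages are resent so the model keeps context. This is a hedged sketch, not gnollama's actual code; the host and model name are placeholders.

```python
import json
import urllib.request

def build_chat_request(host, model, history, prompt):
    """Build an Ollama /api/chat request that carries prior messages
    so conversational context is preserved."""
    messages = list(history) + [{"role": "user", "content": prompt}]
    payload = {"model": model, "messages": messages, "stream": False}
    return urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The second turn includes the first exchange, preserving context.
history = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]
req = build_chat_request(
    "http://localhost:11434", "llama3.2", history, "What did I just say?"
)
```

A "New Response" tab would instead POST a single prompt to /api/generate, with no message history.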
Displays the "Thinking" stream in a fieldset in the response bubble.
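When thinking is enabled, Ollama's streamed chat responses carry a separate `thinking` field alongside `content` in each NDJSON chunk, which is what lets the UI render the two in different areas of the bubble. A sketch of splitting the two streams follows; the chunks are fabricated samples, and the field names follow the Ollama chat API.

```python
import json

def split_stream(ndjson_lines):
    """Separate the 'thinking' text from the final answer in a
    streamed /api/chat response (one JSON object per line)."""
    thinking, content = [], []
    for line in ndjson_lines:
        msg = json.loads(line).get("message", {})
        if msg.get("thinking"):
            thinking.append(msg["thinking"])
        if msg.get("content"):
            content.append(msg["content"])
    return "".join(thinking), "".join(content)

chunks = [
    '{"message": {"role": "assistant", "thinking": "The user greeted me. "}}',
    '{"message": {"role": "assistant", "thinking": "I should greet back."}}',
    '{"message": {"role": "assistant", "content": "Hello!"}}',
]
thinking, answer = split_stream(chunks)
# thinking → "The user greeted me. I should greet back.", answer → "Hello!"
```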
Supports saving chats and their options between sessions, with a list of saved chats in a sidebar.
Both tab types support selecting multiple images to pass to the model.
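Ollama accepts images as base64-encoded strings in an `images` array on both endpoints. A minimal sketch, with a stand-in byte string where gnollama would read files picked in the UI:

```python
import base64

def encode_images(image_bytes_list):
    """Base64-encode raw image bytes for Ollama's 'images' field."""
    return [base64.b64encode(b).decode("ascii") for b in image_bytes_list]

# Stand-in bytes; in practice these come from image files chosen by the user.
fake_png = b"\x89PNG\r\n\x1a\nrest-of-image"
payload = {
    "model": "llava",  # placeholder: any vision-capable model
    "prompt": "Describe this image",
    "images": encode_images([fake_png]),
}
```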
Responses render Markdown and implement code highlighting.
Each tab supports entering any host; model selection options are populated from that host.
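Populating the model dropdown presumably means querying the host's GET /api/tags endpoint, which lists the models installed there. A sketch of parsing its JSON body; the sample response is a fabricated, trimmed example:

```python
import json

def model_names(tags_json):
    """Extract model names from an Ollama GET /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

# Trimmed example of what /api/tags returns:
sample = '{"models": [{"name": "llama3.2:latest"}, {"name": "qwen3:8b"}]}'
names = model_names(sample)
# names → ["llama3.2:latest", "qwen3:8b"]
```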
Options include model selection, thinking, the system prompt, statistics, logprobs, and other parameters (e.g. temperature).
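In the Ollama API, tunables such as temperature travel inside the request's `options` object, while the system prompt is a top-level field of /api/generate. A hedged sketch of assembling such a payload; the values are illustrative only:

```python
def build_generate_payload(model, prompt, system=None, temperature=None):
    """Assemble an /api/generate payload: sampling tunables go under
    'options', the system prompt is a top-level field."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    if system is not None:
        payload["system"] = system
    options = {}
    if temperature is not None:
        options["temperature"] = temperature
    if options:
        payload["options"] = options
    return payload

payload = build_generate_payload(
    "llama3.2", "Why is the sky blue?",
    system="Answer briefly.", temperature=0.2,
)
```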
gnollama can be built and run with GNOME Builder.
- Open GNOME Builder
- Click the Clone Repository button
- Enter https://github.com/jackrabbithanna/gnollama.git in the Repository URL field
- Click the Clone Project button
- Click the Run button to build and start the application
Requires python3 and the Python markdown package. Code highlighting requires GTKSourceView version 5.
To install in Ubuntu:
apt-get install libgtksourceview-5-0 libgtksourceview-5-common libgtksourceview-5-dev
apt-get install gir1.2-gtksource-5
apt-get install python3-markdown python3-gi
meson setup build
meson compile -C build
meson install -C build
You can then run gnollama to execute the application.
- Manage multiple hosts in configuration
- Manage models for each host
- More multi-lingual UI translations
The GNOME Code of Conduct is applicable to this project
gnollama is released under the terms of the GNU General Public License v3.
No warranty provided. No guarantee it does anything at all. Use at your own risk.

