The terminal client for Ollama.
- intuitive and simple terminal UI; no need to run servers or frontends, just type `oterm` in your terminal.
- supports Linux, macOS, and Windows, and most terminal emulators.
- multiple persistent chat sessions, stored together with system prompt & parameter customizations in SQLite.
- support for Model Context Protocol (MCP) tools & prompts integration.
- can use any of the models you have pulled in Ollama, or your own custom models.
- allows for easy customization of the model's system prompt and parameters.
- supports tool integration for providing external information to the model.
Run it directly with `uvx`:

```shell
uvx oterm
```
See Installation for more details.
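Besides running it on demand with `uvx`, oterm can also be installed as a regular package. A sketch of common options, assuming the package and Homebrew formula are both named `oterm` (the Homebrew formula is mentioned under what's new below):

```shell
# Permanent install options (package/formula name "oterm" assumed):
pipx install oterm   # isolated install from PyPI via pipx
brew install oterm   # via Homebrew
```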
- `oterm` is now part of Homebrew!
- Support for "thinking" mode for models that support it.
- Support for streaming with tools!
- Messages UI styling improvements.
- MCP Sampling is here in addition to MCP tools & prompts! Also support for SSE & WebSocket transports for MCP servers.
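As a rough sketch of how an MCP server is wired up, many MCP clients declare servers in a JSON config using the common `mcpServers` schema; the file location, exact keys, and the `mcp-server-git` invocation below are assumptions — check the oterm documentation for the authoritative format:

```json
{
  "mcpServers": {
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git"]
    }
  }
}
```

With a server like this registered, its tools and prompts become available to the model during a chat session.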
The splash screen animation that greets users when they start oterm.
A view of the chat interface, showcasing the conversation between the user and the model.
The model selection screen, allowing users to choose and customize available models.
oterm using the `git` MCP server to access its own repo.
The image selection interface, demonstrating how users can include images in their conversations.
oterm supports multiple themes, allowing users to customize the appearance of the interface.
This project is licensed under the MIT License.