
Mocolamma



What is Mocolamma?

Introduction

Mocolamma is an Ollama management application for macOS and iOS / iPadOS that connects to Ollama servers to manage models and perform chat tests using models stored on the Ollama server.

Note

Generative AI was used in the development of Mocolamma. Therefore, the code might not follow best practices or may contain unstable code.
Mocolamma is an unofficial app for Ollama and is not affiliated with Ollama in any way.

Origin of the name

The name "Mocolamma" is a coined word combining "Model", "Control", "Ollama", and "Manage".
I wanted a name that was easy to read and remember while still being meaningful, and this word is the result.

Download

macOS Version

Mocolamma can be downloaded for free from the releases page, or purchased for $1.99 on the Mac App Store.

iOS / iPadOS Version

Mocolamma can be purchased for $1.99 on the App Store.

Tip

Purchasing Mocolamma on the App Store for macOS, iOS / iPadOS, or visionOS will enable access from all platforms with a single purchase!

Note

App Store prices may vary by region and may be automatically adjusted for exchange rates. The prices above are based on the Japanese yen price of 250 yen, so the actual price may differ depending on your region.

System Requirements

Mocolamma supports Intel-based Macs and Apple Silicon Macs running macOS Sonoma (14.0) or later, and iPhones and iPads running iOS / iPadOS 17.0 or later, as well as Apple Vision Pro running visionOS 2.0 or later as an iPad app.

Note

I do not own an Apple Vision Pro and have only tested on the simulator, not on a real device, so unintended behavior on visionOS is possible.
In my development environment, the visionOS 1.X simulator does not appear in the simulator device list despite being installed and properly configured, so I cannot confirm whether Mocolamma works on visionOS 1.X.
Mocolamma's build settings should support visionOS 1.X as well, so I believe it can be installed there, but since I have not been able to test it on the simulator, it may not work correctly even if installation succeeds.

Differences between free and paid versions

Mocolamma is available as a free version (GitHub version, macOS only) and a paid version (App Store version), and both have nearly identical functionality. The paid version can update automatically through the App Store, while the free version currently has no update-check or installation feature; the app's features themselves are entirely unchanged.
In addition, the App Store version omits the donation link from the about screen in order to pass Apple's review process. There are no other differences.

I would appreciate it if you purchase from the App Store, but feel free to download for free first, and if you find it very useful, please consider purchasing or donating!

Features

Mocolamma can connect to Ollama servers on networks to manage models and perform simple chats using the models.

Server Tab


From the server tab, you can easily manage Ollama servers by adding, editing, and more. By default, the localhost server is registered on macOS, so if you're running the Ollama server on the Mac where you opened Mocolamma, you can start using it immediately without additional server setup.
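Under the hood, an Ollama server listens over plain HTTP on port 11434 by default, which is why the preregistered localhost entry works out of the box. As a rough sketch of the kind of reachability check a client can perform (the function names here are illustrative, not Mocolamma's actual code; `GET /api/version` is a real Ollama endpoint):

```python
import json
import urllib.request
import urllib.error

DEFAULT_PORT = 11434  # Ollama's default listening port

def ollama_base_url(host: str = "localhost", port: int = DEFAULT_PORT) -> str:
    """Build the base URL for an Ollama server entry."""
    return f"http://{host}:{port}"

def check_server(base_url: str, timeout: float = 2.0):
    """Return the server's version string if reachable, else None.

    Uses the Ollama HTTP API's GET /api/version endpoint.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=timeout) as resp:
            return json.load(resp).get("version")
    except (urllib.error.URLError, OSError):
        return None
```

A client would call `check_server(ollama_base_url())` to verify the default localhost entry before listing models.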

Model Tab


From the model tab, you can view models stored on the selected server and add models to the selected server. By opening the inspector, you can check specific model details as well.
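The model list shown in this tab corresponds to what the Ollama API's `GET /api/tags` endpoint returns. A minimal sketch of parsing that response into name and size pairs (the sample response below is illustrative, but matches the shape `/api/tags` actually returns):

```python
import json

def parse_model_list(tags_json: str):
    """Extract (name, size-in-bytes) pairs from a GET /api/tags response body."""
    data = json.loads(tags_json)
    return [(m["name"], m.get("size", 0)) for m in data.get("models", [])]

# Illustrative response body in the shape /api/tags returns.
sample = json.dumps({
    "models": [
        {"name": "llama3.2:3b", "size": 2019393189},
        {"name": "qwen2.5:7b", "size": 4683087332},
    ]
})
```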

Chat Tab


The chat tab allows you to have simple chats using models stored on the selected server. This is merely a simple chat usable for model testing, so it doesn't have message saving functionality or detailed parameter settings, but it's convenient for casually testing downloaded models.
For advanced chat features, I recommend using the official Ollama app or specialized chat-focused applications.
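The temperature and context-window settings mentioned below map onto the `options` object of the Ollama API's `POST /api/chat` endpoint. A hedged sketch of assembling such a request body (the helper name and default values are illustrative):

```python
def build_chat_payload(model: str, prompt: str,
                       temperature: float = 0.8, num_ctx: int = 4096) -> dict:
    """Assemble a POST /api/chat request body for the Ollama API.

    `temperature` and `num_ctx` go into the `options` object, matching
    the temperature and context-window sliders in the chat settings.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "options": {"temperature": temperature, "num_ctx": num_ctx},
    }

payload = build_chat_payload("llama3.2:3b", "Hello!", temperature=0.2)
```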

Privacy and Security

Mocolamma does not collect any information about users.
Usage data and crash reports may be shared if you have enabled "Share with App Developers" under "Analytics & Improvements" in the system settings, but the app itself has no functionality to collect or send information, so you can use it with peace of mind even if you are concerned about privacy.

Support and Feedback

Bug Reports

Mocolamma is an app developed using generative AI. Although extensive testing was performed during development, bugs may remain, or some functions may not work properly.

If you find a bug or an operational issue, please first check the open Issues (known bugs and problems) to see whether someone has already reported it. If you can't find the same issue, please open a new Issue to report the problem.
To make bugs easier to track, please open one Issue per problem; in other words, if you want to report two bugs, open two Issues.

Feedback

If you don't have a GitHub account and want to submit a bug report, share an idea, or send a message to the developer (me), please click this link or send an email through the "Send Feedback" button on the about screen (reached via "About Mocolamma" on macOS, or via the information button in the settings tab on iOS / iPadOS / visionOS). Please note that I may not be able to reply to every message.
Opening the email sending screen from the in-app button will pre-fill necessary information such as system information (device model and OS version) and app version information, so I recommend sending from there.

Community

A Discussions page is available where you can share new features you'd like added to the app, ask questions about potential issues, or exchange opinions with others.
Please make good use of it as a place for information exchange. I often look at it too, so messages to the developer are welcome!

Support Developer

Give a Star

Please click the "Star" button in the upper right corner of this page to give the project a star!
This button works like a thumbs-up and keeps me motivated to continue development. Starring is free, so please give one if you like Mocolamma!

Donate

If you like Mocolamma, please donate to support continued development!

You can donate using the following services.

Buy Me a Coffee

You can support me for the cost of a cup of green tea on Buy Me a Coffee.


PayPal.Me

If you have a PayPal account, you can donate directly through PayPal.

Credits

Ollama by Ollama

Mocolamma is an app specialized in managing and operating Ollama servers and models. Without Ollama, Mocolamma would not have emerged.

Gemini CLI by Google / Qwen Code by Qwen / opencode by SST

These excellent generative AI tools were used in the development of Mocolamma. As someone with no programming knowledge, Swift included, I could not have completed this app without the help of generative AI.

MarkdownUI by Guillermo Gonzalez

The MarkdownUI package was used to implement the Markdown rendering in the chat screen. Thanks to this package, I was able to implement beautiful Markdown rendering very easily.

CompactSlider by Alexey Bukhtin

The CompactSlider package was used to implement the sliders for temperature and context window in chat settings. Thanks to this package, I was able to implement beautiful customized sliders.

create-dmg by Andrey Tarantsov and Andrew Janke

The create-dmg shell script was used to create the disk image for distributing the free version. Thanks to this shell script, I was able to easily create customized disk images.
