LLM FlatChat is a small chat client built with Flet. It connects to a KoboldCPP server and stores your settings locally in `storage/settings.json`. The app lets you swap between a chat view and a settings view via the navigation drawer.
- Python 3.11+
- Flet (install with `pip install "flet[all]"`)
After installing the requirements, you can run the app by executing the following command in the project directory:

```shell
flet run
```

Start your KoboldCPP server first. Open the settings view to enter your name above the KoboldCPP URL and adjust the API endpoint if needed. The URL is persisted in `storage/settings.json` and diary entries are stored in `storage/diary.db`.
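The persisted settings file is plain JSON. As an illustration only (the actual key names are defined by the app's settings code, not documented here), it might look like:

```json
{
  "user_name": "Alice",
  "kobold_url": "http://localhost:5001",
  "system_prompt": "You are a helpful assistant.",
  "temperature": 0.7,
  "max_tokens": 512
}
```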
Flet can package the application for multiple platforms:
```shell
flet build apk -v      # Android
flet build ipa -v      # iOS
flet build macos -v    # macOS
flet build linux -v    # Linux
flet build windows -v  # Windows
```

See the Flet publish guide for signing and distribution details.
- Chat with KoboldCPP via a clean Flet UI
- Chat history context awareness
- Persistent settings (KoboldCPP URL and LLM parameters)
- Simple diary stored in an SQLite database
- Diary view bottom bar with command and save buttons
- Self-reflection question command powered by KoboldCPP
- Previous entry summary command powered by KoboldCPP
- Diary questions stream live as they are generated
- Configurable self-reflection prompts and parameters
- View saved diary entries sorted by date
- Tap a saved entry to reopen and edit it
- Delete saved entries with a long press and confirmation popup
- User data is stored under the `storage/` folder
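The diary storage can be pictured as a single SQLite table sorted by date. This is a hedged sketch; the table and column names are assumptions, not the actual schema of `storage/diary.db`:

```python
import sqlite3

# Illustrative schema; the real column names in storage/diary.db may differ.
conn = sqlite3.connect(":memory:")  # the app would open storage/diary.db
conn.execute(
    "CREATE TABLE IF NOT EXISTS entries ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  created_at TEXT NOT NULL,"  # ISO-8601 date string, sortable as text
    "  content TEXT NOT NULL)"
)

def save_entry(text: str, date: str) -> None:
    conn.execute(
        "INSERT INTO entries (created_at, content) VALUES (?, ?)", (date, text)
    )
    conn.commit()

def entries_by_date() -> list[tuple[str, str]]:
    # Newest first, matching the "view saved entries sorted by date" feature
    return conn.execute(
        "SELECT created_at, content FROM entries ORDER BY created_at DESC"
    ).fetchall()

save_entry("First entry", "2024-01-01")
save_entry("Second entry", "2024-01-02")
```

Storing dates as ISO-8601 strings keeps the `ORDER BY` correct with plain text comparison, which is why the sketch avoids a locale-dependent date format.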
The application code lives under the src folder and is split into two main
packages:
- `backend/` – Contains logic unrelated to the UI. `backend.py` stores chat history and talks to the KoboldCPP API, `models.py` defines the `Message` dataclass, while `settings_manager.py` loads and saves the persistent `AppSettings`.
- `frontend/` – All Flet UI components. `app.py` wires everything together, `chat_view.py` displays the message list, `chat_message.py` renders individual messages, and `settings_view.py` hosts the settings form.
Execution begins in `main.py`, which creates the backend and loads settings before handing control to Flet.
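As a rough sketch of that split, the backend side might look like the following. The field names are assumptions for illustration; the real definitions live in `models.py` and `settings_manager.py`:

```python
import json
from dataclasses import dataclass, asdict
from pathlib import Path

@dataclass
class Message:
    # Hypothetical fields; see src/backend/models.py for the real dataclass.
    role: str  # e.g. "user" or "assistant"
    content: str

@dataclass
class AppSettings:
    # Hypothetical fields mirroring the persisted options described above.
    user_name: str = ""
    kobold_url: str = "http://localhost:5001"
    temperature: float = 0.7
    max_tokens: int = 512

def save_settings(settings: AppSettings, path: Path) -> None:
    # Persist settings as JSON, creating storage/ on first run.
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(asdict(settings), indent=2))

def load_settings(path: Path) -> AppSettings:
    if not path.exists():
        return AppSettings()  # fall back to defaults on first run
    return AppSettings(**json.loads(path.read_text()))
```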
Development guidelines and testing instructions are described in
AGENTS.md. To run the unit tests yourself install the project with
the optional development dependencies and execute:
```shell
pip install -e .[dev]
python -m pytest
```

Offline documentation for Flet lives under docs/flet-docs and can be consulted without internet access. Langchain documentation resides under docs/langchain-docs and currently includes the LangMem module.
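The backend's call to KoboldCPP can be sketched like this. KoboldCPP serves the KoboldAI text-generation API at `/api/v1/generate`, though the exact parameters `backend.py` sends are assumptions here:

```python
import json
import urllib.request

def build_payload(prompt: str, temperature: float = 0.7, max_length: int = 512) -> dict:
    # Parameter names follow the KoboldAI generate API that KoboldCPP serves;
    # which ones backend.py actually sets is an assumption.
    return {
        "prompt": prompt,
        "temperature": temperature,
        "max_length": max_length,
    }

def generate(base_url: str, prompt: str) -> str:
    # Blocking, non-streaming request to the generate endpoint.
    req = urllib.request.Request(
        base_url.rstrip("/") + "/api/v1/generate",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # KoboldCPP wraps generations as {"results": [{"text": ...}]}.
    return body["results"][0]["text"]
```

A call such as `generate("http://localhost:5001", "Hello")` would return the model's completion, assuming a KoboldCPP server is running at that URL.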
- Settings menu
  - Set LLM URL (KoboldCPP URL)
  - Settings for LLM system prompt
  - Settings for LLM temperature
  - Settings for LLM max tokens
  - Diary question prompt and parameters
- Chat
  - Bring in last messages as context
  - Make past messages of LLM or user editable
  - Let LLM regenerate last message
  - Save chat
  - Long-term memory via Langchain (LangMem)
  - Fix streaming messages: despite `"stream": True` in `backend.py`, streaming does not work
  - Read and print which model is available (`curl http://localhost:5001/v1/models` gives a list of available models)
  - Markdown rendering
- Diary functionality, to let users write and save diary entries (can later be used as context for the LLM through Langchain)
  - Create a diary button in the navigation drawer that leads into a diary view
  - Give the diary view a text-editor style for writing diary entries
  - Command popup to trigger LLM self-reflection questions
- Frontend general
  - Improve styling
- User functionality
  - User name stored in settings
- Chat
  - Long chat messages are not displayed correctly; they need to wrap at the edge of the display
- Diary
  - Entered text is not kept temporarily when switching to the chat view (annoying when you forget to save and switch to the chat view)
  - The self-reflection question popup does not show if the text is very long
- Settings menu
  - The temperature slider is not labelled
  - Settings are not saved automatically when changed without pressing "Save" (annoying when you forget to save and switch to the chat view)
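Regarding the streaming TODO above: KoboldCPP also exposes a dedicated SSE streaming endpoint (`/api/extra/generate/stream`), and a `"stream": True` flag on the plain generate route may simply be ignored, which could explain the bug. A hedged sketch of parsing such an SSE stream follows; the `token` field name matches KoboldCPP's stream events, but verify against your server version:

```python
import json

def parse_sse_tokens(raw: str) -> list[str]:
    # Extract token payloads from Server-Sent Events lines of the form
    #   data: {"token": "..."}
    # Event format assumed from KoboldCPP's /api/extra/generate/stream.
    tokens = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            payload = json.loads(line[len("data:"):].strip())
            if "token" in payload:
                tokens.append(payload["token"])
    return tokens

# Example stream body, two events delivering the word "Hello" in pieces.
sample = (
    "event: message\n"
    'data: {"token": "Hel"}\n'
    "\n"
    "event: message\n"
    'data: {"token": "lo"}\n'
)
```

In the app, each parsed token would be appended to the current chat message and the UI refreshed, rather than waiting for the full response.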