Releases: Quantisan/gremllm
v0.4.0 - Chat Polish & Architecture Refinement
This release focuses on improving the chat experience and strengthening the FCIS (Functional Core, Imperative Shell) architecture.
Features
- Topic deletion with native confirmation dialog (#128, #129)
- Markdown rendering for assistant messages with XSS protection (#123)
- Multi-line chat input with Shift+Enter for newlines (#122)
- New Window menu action with Cmd+N shortcut (#131)
- Spinning "computing" indicator during LLM responses (#124)
- Textarea retains focus after message submission (#125)
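The multi-line input behavior above follows a common pattern: Enter submits, Shift+Enter inserts a newline. A minimal sketch in TypeScript (illustrative only; the helper name and wiring are assumptions, not gremllm's actual code):

```typescript
// Minimal shape of the key event we care about.
interface KeyInput {
  key: string;
  shiftKey: boolean;
}

// Pure decision: Enter alone submits; Shift+Enter falls through,
// letting the textarea insert a newline.
function shouldSubmit(e: KeyInput): boolean {
  return e.key === "Enter" && !e.shiftKey;
}

// Wiring in the imperative shell: intercept the keydown and submit,
// preventing the default newline from landing in the textarea.
function onKeyDown(e: KeyInput & { preventDefault: () => void }, submit: () => void): void {
  if (shouldSubmit(e)) {
    e.preventDefault();
    submit();
  }
}
```

Keeping the decision in a pure function matches the FCIS split the release notes describe: the DOM event handling stays in the shell, while the logic is trivially testable.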
Architecture & Refactoring
- Standardize async IPC handlers through action registry (#130)
- Decouple DOM effects into pure actions with generic placeholders (#126)
- Consolidate DOM effect registrations in imperative shell (#127)
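The action-registry approach in #130 can be sketched roughly as follows. This is a simplified illustration under assumed names ("topic/delete" is hypothetical), not gremllm's actual code:

```typescript
// Every async IPC request flows through one dispatch path instead of
// ad-hoc per-channel handlers.
type Handler = (payload: unknown) => Promise<unknown>;

const registry = new Map<string, Handler>();

function register(action: string, handler: Handler): void {
  registry.set(action, handler);
}

async function dispatch(action: string, payload: unknown): Promise<unknown> {
  const handler = registry.get(action);
  if (!handler) throw new Error(`Unknown action: ${action}`);
  return handler(payload);
}

// Hypothetical registration; real handlers would call into the core.
register("topic/delete", async (payload) => ({ deleted: true, payload }));
```

In an Electron app, a single `ipcMain.handle` call can then route every incoming channel through `dispatch`, which standardizes error handling and keeps handler registration in one place.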
Full Changelog: v0.3.0...v0.4.0
v0.3.0 - Multi-Provider LLM Support
🎉 Major Features
Multi-Provider LLM Support (#121)
- Added support for three major LLM providers: Claude (Anthropic), GPT (OpenAI), and Gemini (Google)
- Configure API keys for any or all providers through the Settings interface
- Secure encrypted storage for API keys (with session-only fallback when unavailable)
- Environment variable override support for development (ANTHROPIC_API_KEY, OPENAI_API_KEY, GOOGLE_API_KEY)
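The override order implied above — environment variable first, then the key stored via Settings — can be sketched as a one-line precedence rule (the helper name is hypothetical, not gremllm's code):

```typescript
// Resolve an API key: a development environment variable, if set,
// takes precedence over the key stored through the Settings UI.
// (Hypothetical helper illustrating the precedence only.)
function resolveApiKey(envValue: string | undefined, storedKey: string | null): string | null {
  return envValue ?? storedKey;
}

// e.g. resolveApiKey(process.env.ANTHROPIC_API_KEY, keyFromSettings)
```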
Model Selection (#112)
- New model selector UI in the chat interface
- Choose from Claude 4.5 Sonnet, GPT-5, or Gemini 2.5 Flash
- Model selection automatically determines the correct provider
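Deriving the provider from the selected model, as #112 and #114 describe, comes down to a single lookup table. A sketch using the display names from these notes as keys (the app's internal model identifiers may differ):

```typescript
type Provider = "anthropic" | "openai" | "google";

// Single source of truth: each selectable model maps to its provider.
// (Keys are the display names above; internal IDs are likely different.)
const MODEL_PROVIDERS: Record<string, Provider> = {
  "Claude 4.5 Sonnet": "anthropic",
  "GPT-5": "openai",
  "Gemini 2.5 Flash": "google",
};

function providerFor(model: string): Provider {
  const provider = MODEL_PROVIDERS[model];
  if (!provider) throw new Error(`Unknown model: ${model}`);
  return provider;
}
```

Centralizing this map in the schema means the UI, the IPC layer, and the request code never disagree about which provider a model belongs to.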
Auto-Save (#97)
- Topics now auto-save after every message exchange with the LLM
- Topic name changes trigger automatic saves
- No more manual save steps needed
🔧 Technical Improvements
- Normalized LLM response handling across all providers at IPC boundary (#111)
- Centralized model and provider logic in schema for single source of truth (#114)
- Eliminated duplicate model state between form and topic (#113)
- Separated unit tests from integration tests for faster CI (#107)
- Added comprehensive test coverage for LLM responses, topic operations, and workspace flows (#103)
- Improved error logging for LLM requests (#104)
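Normalizing responses at the IPC boundary (#111) typically means mapping each provider's payload shape to one internal shape before it crosses into the renderer. A simplified sketch with heavily abbreviated payload shapes (real provider responses carry usage, stop reasons, and more; this is not gremllm's actual code):

```typescript
// One internal shape, whatever the provider returned.
interface NormalizedResponse {
  provider: string;
  text: string;
}

// Abbreviated payload shapes based on each provider's public API.
function normalize(provider: string, raw: any): NormalizedResponse {
  switch (provider) {
    case "anthropic": // content is an array of blocks
      return { provider, text: raw.content[0].text };
    case "openai": // chat completions: choices[].message.content
      return { provider, text: raw.choices[0].message.content };
    case "google": // candidates[].content.parts[].text
      return { provider, text: raw.candidates[0].content.parts[0].text };
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}
```

Doing this once at the boundary keeps the renderer free of provider-specific branching.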
📦 Dependencies
- Updated Electron to v38
- Updated @electron/fuses to v2
- General dependency updates
Note: This release transforms Gremllm from an Anthropic-only app into a multi-provider LLM interface, with the foundation to easily add more providers in the future.
v0.2.0
Highlights
- Multi‑topic workspaces with a functional topics tree and topic listing.
- Unsaved change indicator: topics show an asterisk when there are unsaved edits.
- Double click to rename topics from the UI.
- Welcome screen when no workspace is open.
- “Open Folder” flow: menu item and welcome‑screen button to choose a workspace.
- Load an entire workspace folder when opening/bootstrapping.
- Workspace name shown in the UI.
Bug fixes
- Topic files are saved with the correct data schema (improves reliability of saved files).
Breaking behaviour changes
- No default workspace is loaded at startup; users must choose a workspace via Open Folder.
- Topic filenames now use the topic ID.
v0.1.1
This release fixes a bug where the Submit button didn't work.
Instructions:
I don't have an Apple Developer ID yet, so the .dmg doesn't work on newer macOS. To try this release, you'll need to build it locally:

```shell
npm install
npm run dev
```
v0.1.0
This is the first release of Gremllm. It provides a simple chat interface that automatically saves your conversation history locally. Your API key is also stored securely on your machine using native OS encryption, so all your secrets stay private.
Instructions:
I don't have an Apple Developer ID yet, so the .dmg doesn't work on newer macOS. To try this release, you'll need to build it locally:

```shell
npm install
npm run dev
```