llumen v0.1.1 - Ignite LLM Chats with Simplicity! 🚀
Welcome to the initial release of llumen!
Built with Rust for the backend and SvelteKit for the frontend, llumen starts in under 1 second and sips less than 10 MiB of disk space. Dive in, chat with models, and explore modes like web-search-enabled conversations—all out of the box.
🌟 Key Features
- Single API Key Magic: Just plug in your OpenRouter key for full LLM access—no extra keys needed for search, OCR, embeddings, and more (see the sketch after this list).
- Blazing Fast & Lean: Startup in <1s, tiny footprint (<10 MiB).
- Rich Chat Experience: Markdown rendering with code blocks and math support ($E=mc^2$ looks crisp!). Multiple modes: normal chats, web-search enabled, and upcoming deep-research/agentic features (WIP 🚧).
- Cross-Platform Ready: Windows 🪟 executables, Linux binaries, and Docker 🐳 images for seamless setup.
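The single-key setup works because OpenRouter exposes an OpenAI-compatible chat completions endpoint, so one key covers every model. The Rust sketch below is purely illustrative, not llumen's actual source: the `OPENROUTER_API_KEY` variable name and the example model slug are assumptions, and the dependencies (`reqwest` with the `blocking` and `json` features, plus `serde_json`) are only what this snippet needs.

```rust
// Illustrative sketch only: the kind of OpenRouter call a backend makes
// with a single API key. Not llumen's actual code.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumption: the key is provided via the OPENROUTER_API_KEY env var.
    let api_key = std::env::var("OPENROUTER_API_KEY")?;

    // OpenAI-compatible request body; the model slug is just an example.
    let body = json!({
        "model": "openai/gpt-4o-mini",
        "messages": [{ "role": "user", "content": "Hello from llumen!" }]
    });

    // Send the chat completion request to OpenRouter.
    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("https://openrouter.ai/api/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .error_for_status()?
        .json()?;

    // Print the assistant's reply.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}
```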
📝 Changelog
- Initial release: Core backend (Rust) and frontend (SvelteKit) integration.
- Full LLM chat functionality via OpenRouter, including file upload, image upload, and search.
- Markdown rendering, multi-mode chats, and static/distroless Docker builds.
- Screenshots and docs for easy onboarding.