llumen v0.1.1 - Ignite LLM Chats with Simplicity! 🚀

Released by @Eason0729 on 27 Sep 16:22 · 74 commits to main since this release

Welcome to the initial release of llumen!

Built with Rust for the backend and SvelteKit for the frontend, llumen starts in under 1 second and sips less than 10 MiB of disk space. Dive in, chat with models, and explore modes like web-search-enabled conversations—all out of the box.

🌟 Key Features

  • Single API Key Magic: Just plug in your OpenRouter key for full LLM access; no extra keys or services needed for search, OCR, embeddings, or anything else.
  • Blazing Fast & Lean: Startup in <1s, tiny footprint (<10 MiB).
  • Rich Chat Experience: Markdown rendering with code blocks and math support ($E=mc^2$ looks crisp!). Multiple modes: normal chats, web-search enabled, and upcoming deep-research/agentic features (WIP 🚧).
  • Cross-Platform Ready: Windows 🪟 executables, Linux binaries, and Docker 🐳 images for seamless setup (see the quick-start sketch after this list).
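
For a quick start, the Docker image can be launched with a single command along the lines of the sketch below. The image path, exposed port, volume path, and the `OPENROUTER_API_KEY` variable name are assumptions for illustration, not confirmed values from this release; check the project README for the exact ones.

```sh
# Minimal quick-start sketch. Assumptions: image path, port mapping,
# volume path, and the OPENROUTER_API_KEY variable name are illustrative only.
docker run -d \
  --name llumen \
  -p 8080:8080 \
  -e OPENROUTER_API_KEY="sk-or-..." \
  -v llumen-data:/data \
  ghcr.io/eason0729/llumen:latest
```

With a setup like this, the web UI would be reachable at http://localhost:8080 (given the port mapping above), and the OpenRouter key is the only credential llumen needs.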

📝 Changelog

  • Initial release: Core backend (Rust) and frontend (SvelteKit) integration.
  • Full LLM chat functionality via OpenRouter, including file upload, image upload, and search.
  • Markdown rendering, multi-mode chats, and static/distroless Docker builds.
  • Screenshots and docs for easy onboarding.