Local LLM Power-Up & Self-Hosting Improvements
This release supercharges Note Companion with local AI capabilities and smoother self-hosting! You can now use your own local LLM (Ollama) for file classification and for tag and folder suggestions, with no external server required. Self-hosted users also get a streamlined experience, with license checks bypassed automatically. Enjoy a more private, flexible, and powerful Note Companion!
Technical Changes
- Added support for using a local LLM (Ollama) for file classification and for tag and folder suggestions when enabled in settings (see the sketch after this list).
- Integrated the local LLM logic into the organizer and chat components so that cloud/local model selection behaves consistently in both.
- Improved error handling for local LLM operations so that local-only mode never falls back to external servers.
- License validation is now bypassed automatically when self-hosting is enabled, simplifying setup for self-hosted users (see the license-check sketch below).
- Massive CSS update: added modern utility classes for improved UI consistency and accessibility.
- Bumped plugin version to 3.5.2.
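As a rough illustration of how the local path can work, here is a minimal sketch that routes a classification request to a local Ollama server and surfaces failures instead of retrying against a cloud backend. The setting names (`useLocalLLM`, `ollamaUrl`, `ollamaModel`) and the `classifyFile` helper are assumptions for illustration, not the plugin's actual API; only Ollama's `/api/generate` endpoint is a real, documented interface.

```typescript
// Illustrative sketch only: setting names and helper are assumptions, not the plugin's API.
interface CompanionSettings {
  useLocalLLM: boolean; // hypothetical toggle exposed in plugin settings
  ollamaUrl: string;    // e.g. "http://localhost:11434"
  ollamaModel: string;  // e.g. "llama3.1"
}

// Ask the local Ollama server to suggest a folder and tags for a note.
async function classifyFile(content: string, settings: CompanionSettings): Promise<string> {
  if (!settings.useLocalLLM) {
    throw new Error("Cloud path omitted in this sketch");
  }
  const res = await fetch(`${settings.ollamaUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: settings.ollamaModel,
      prompt: `Suggest a folder and tags for this note:\n\n${content}`,
      stream: false,
    }),
  });
  if (!res.ok) {
    // In local-only mode the error is surfaced to the user instead of
    // silently retrying against an external server.
    throw new Error(`Local LLM request failed: ${res.status} ${res.statusText}`);
  }
  const data = await res.json();
  return data.response; // Ollama returns the generated text in the `response` field
}
```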
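The license-check change can be pictured as a simple short-circuit before any remote call. A hedged sketch, assuming a `selfHosting` setting and a `validateLicenseWithServer` helper that are illustrative names rather than the plugin's real identifiers:

```typescript
// Sketch: skip remote license validation entirely when self-hosting is enabled.
// `selfHosting` and `validateLicenseWithServer` are assumed names for illustration.
async function isLicenseValid(settings: { selfHosting: boolean; licenseKey: string }): Promise<boolean> {
  if (settings.selfHosting) {
    return true; // self-hosted instances are treated as licensed without a server round-trip
  }
  return validateLicenseWithServer(settings.licenseKey);
}

declare function validateLicenseWithServer(key: string): Promise<boolean>;
```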
SHA-256 Checksums
c3ad26a9afb69d0901bc479b1aca564b93ba3aa662f41a539d85e58c1cfbf0e1 main.js
6087da77e4945611464635606ada9f2e55e273c905347416c089c56af9aa1c19 styles.css
296e707a6d3a927a38864261ac207ce7e0f4ca56268822253701dde9eb0ae458 manifest.json
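To check a downloaded artifact against these digests, something like the following Node/TypeScript sketch can be used (file names assumed to be relative to the working directory):

```typescript
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// Compare a downloaded release file against its published SHA-256 digest.
function matchesChecksum(path: string, expectedHex: string): boolean {
  const digest = createHash("sha256").update(readFileSync(path)).digest("hex");
  return digest === expectedHex;
}

console.log(
  matchesChecksum("main.js", "c3ad26a9afb69d0901bc479b1aca564b93ba3aa662f41a539d85e58c1cfbf0e1")
);
```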