
Releases: settinghead/voxlert

v0.3.10 — pi integration, local LLM support, --onboard flag

20 Mar 00:39


What's New

pi coding agent support

  • New @settinghead/pi-voxlert package — install with pi install npm:@settinghead/pi-voxlert
  • Hooks into the pi agent lifecycle: voice notifications on task completion and on tool errors
  • Auto-setup flow: prompts to install and configure Voxlert on first use
  • /voxlert command with setup, test, and status subcommands
  • voxlert_speak tool — lets the LLM speak phrases aloud on demand
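A typical first run, using only the commands named above, might look like the sketch below. The descriptions of what each subcommand does are assumptions based on their names; only the commands themselves come from these notes.

```shell
# Install the pi extension (command from the release notes)
pi install npm:@settinghead/pi-voxlert

# Inside a pi session, /voxlert exposes the listed subcommands:
#   /voxlert setup    - run the guided Voxlert setup
#   /voxlert test     - presumably plays a test notification
#   /voxlert status   - presumably shows the current configuration
```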

Local LLM support

  • Ollama, LM Studio, and llama.cpp backends for fully offline phrase generation
  • No API keys required for the complete local pipeline
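For the Ollama backend, a fully local pipeline might be set up as follows. The model choice is illustrative, not a Voxlert requirement; `voxlert setup` (from this release) is assumed to be where the backend is selected.

```shell
# Pull a small model and start a local Ollama server
# (llama3.2 is an example model, not a Voxlert requirement)
ollama pull llama3.2
ollama serve &

# Select the local backend via the setup wizard
voxlert setup
```

With a local backend selected, phrase generation runs entirely offline, so no API keys are needed.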

Improved onboarding

  • npx voxlert --onboard — single command to start the setup wizard
  • voxlert setup --yes — non-interactive setup for CI and automated flows
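As a sketch of the CI use case, a pipeline step could install the package globally and run the non-interactive setup before any steps that rely on it; both commands appear in these notes.

```shell
# Non-interactive setup, e.g. in a CI script
npm install -g @settinghead/voxlert
voxlert setup --yes
```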

Other improvements

  • In-memory LRU cache for TTS WAV output, so repeated phrases are not re-synthesized
  • Cost FAQ in README
  • Configuration saved at key setup steps to avoid losing progress

Install

npm install -g @settinghead/voxlert
voxlert setup

Or try without installing: npx voxlert --onboard

Demo: https://youtu.be/5xFXGijwJuk