
Commit 901fd35

release: v0.2.1
## Description

This PR delivers a focused update improving provider model management, fixing API key persistence issues, enhancing Whisper model support, and bundling FFmpeg at build time for a smoother first-run experience.

Key highlights include:

* Dynamic real-time model fetching from OpenAI, Anthropic, and Groq (when an API key is provided)
* Build-time FFmpeg bundling with a 3-tier fallback (bundled → dev → system)
* Support for Q5_1 quantized Whisper models (tiny, base, small)
* Centralized provider API key management
* Proper model and endpoint state preservation when switching providers
* Multiple Whisper metadata corrections
* Windows system audio device reset fix
* Transcript copy newline preservation
* UI improvements to model selection and notifications

---

## Related Issue

Fixes #307
Fixes #346
Fixes #322
Fixes #323

(Also includes fixes reported by community members in related provider/model issues.)

---

## Type of Change

* [x] Bug fix
* [x] New feature
* [ ] Documentation update
* [x] Performance improvement
* [x] Code refactoring
* [ ] Other (please describe)

---

## Testing

* [x] Unit tests added/updated
* [x] Manual testing performed
* [x] All tests pass

Tested scenarios include:

* Provider switching with API key persistence
* Model selection preservation per provider
* Dynamic model fetching with valid API keys
* Custom OpenAI-compatible endpoint connection testing
* Whisper model downloads and metadata validation
* Q5_1 model loading and transcription
* FFmpeg resolution (bundled/dev/system fallback)
* Windows system audio device persistence
* Transcript copy with preserved newlines

---

## Documentation

* [x] Documentation updated
* [ ] No documentation needed

Updated:

* Release notes

---

## Checklist

* [x] Code follows project style
* [x] Self-reviewed the code
* [x] Added comments for complex code
* [x] Updated README if needed
* [x] Branch is up to date with devtest
* [x] No merge conflicts

---

## Additional Notes

* FFmpeg download source switched to Zackriya's GitHub release for improved Windows reliability.
* Model fetching now occurs dynamically when opening the model selector and an API key is present.
* Provider-specific model selection and endpoints are now cached and automatically restored.
* Includes contributions from:
  * @matbe (Windows audio fix, Whisper metadata corrections)
  * @5m4u66y (OpenAI-compatible API connection test fix)
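The 3-tier FFmpeg fallback described above (bundled → dev → system) can be sketched as a simple resolution chain. This is an illustrative std-only sketch, not the actual Meetily code: the function name `resolve_ffmpeg` and the candidate-path layout are assumptions, with the system tier implemented as a manual PATH scan.

```rust
use std::path::{Path, PathBuf};

/// Hypothetical sketch of the 3-tier FFmpeg resolution order: prefer the
/// binary bundled at build time, then a dev-tree copy, then the system PATH.
/// The parameter names and path layout are illustrative only.
fn resolve_ffmpeg(bundled: &Path, dev: &Path) -> Option<PathBuf> {
    // Tier 1: binary bundled next to the app at build time
    if bundled.is_file() {
        return Some(bundled.to_path_buf());
    }
    // Tier 2: binary placed in the dev tree during `cargo build`
    if dev.is_file() {
        return Some(dev.to_path_buf());
    }
    // Tier 3: fall back to whatever `ffmpeg` is on the system PATH
    std::env::var_os("PATH").and_then(|paths| {
        std::env::split_paths(&paths)
            .map(|dir| dir.join(if cfg!(windows) { "ffmpeg.exe" } else { "ffmpeg" }))
            .find(|candidate| candidate.is_file())
    })
}
```

The ordering matters: the bundled binary is checked first so an end-user install never depends on a system FFmpeg being present.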
2 parents be8e5cb + b0247cb commit 901fd35

38 files changed: +1648 −324 lines

### .github/workflows/build-macos.yml

Lines changed: 8 additions & 2 deletions

```diff
@@ -139,14 +139,20 @@ jobs:
       run: |
         echo "Building llama-helper sidecar with Metal support..."
         cargo build --release -p llama-helper --features metal
-
+
         # Copy binary to binaries directory
         mkdir -p frontend/src-tauri/binaries
         cp target/release/llama-helper frontend/src-tauri/binaries/llama-helper-aarch64-apple-darwin
-
+
         echo "Copied llama-helper to frontend/src-tauri/binaries/"
         ls -la frontend/src-tauri/binaries/

+    - name: Cache FFmpeg binary
+      uses: actions/cache@v4
+      with:
+        path: frontend/src-tauri/binaries/ffmpeg-*
+        key: ${{ runner.os }}-ffmpeg-${{ hashFiles('frontend/src-tauri/build.rs', 'frontend/src-tauri/build/ffmpeg.rs') }}
+
    - name: Build Tauri app (with code signing)
      if: ${{ github.event.inputs.sign-build == 'true' }}
      uses: tauri-apps/tauri-action@v0
```

### .github/workflows/build-windows.yml

Lines changed: 6 additions & 0 deletions

```diff
@@ -660,6 +660,12 @@ jobs:
         Write-Host "Copied llama-helper to frontend/src-tauri/binaries/"
         Get-ChildItem "frontend/src-tauri/binaries/"

+    - name: Cache FFmpeg binary
+      uses: actions/cache@v4
+      with:
+        path: frontend/src-tauri/binaries/ffmpeg-*.exe
+        key: ${{ runner.os }}-ffmpeg-${{ hashFiles('frontend/src-tauri/build.rs', 'frontend/src-tauri/build/ffmpeg.rs') }}
+
    - name: Build Tauri app
      uses: tauri-apps/tauri-action@v0
      env:
```

### .github/workflows/build.yml

Lines changed: 6 additions & 0 deletions

```diff
@@ -560,6 +560,12 @@ jobs:
         echo "Copied llama-helper to frontend/src-tauri/binaries/llama-helper-${TARGET}"
         ls -la frontend/src-tauri/binaries/

+    - name: Cache FFmpeg binary
+      uses: actions/cache@v4
+      with:
+        path: frontend/src-tauri/binaries/ffmpeg-*
+        key: ${{ runner.os }}-ffmpeg-${{ hashFiles('frontend/src-tauri/build.rs', 'frontend/src-tauri/build/ffmpeg.rs') }}
+
    - name: Build with Tauri
      id: tauri-build
      uses: tauri-apps/tauri-action@v0
```

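The three workflows above share the same cache key: `hashFiles(...)` over `build.rs` and `build/ffmpeg.rs`, so the cached FFmpeg binary is invalidated exactly when the download logic changes. As a rough std-only analogue of that content-addressed key (the function name `cache_key` and the key format are assumptions, and GitHub's `hashFiles` actually uses SHA-256, not Rust's `DefaultHasher`):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Illustrative analogue of the workflow's `hashFiles(...)` cache key:
/// hash the contents of the watched files so the key changes whenever
/// the FFmpeg download logic changes, forcing a fresh download.
fn cache_key(os: &str, file_contents: &[&str]) -> String {
    let mut hasher = DefaultHasher::new();
    for contents in file_contents {
        // Each file's contents feed the same hasher, so editing any
        // watched file produces a different key.
        contents.hash(&mut hasher);
    }
    format!("{}-ffmpeg-{:016x}", os, hasher.finish())
}
```

Keying on the build scripts (rather than, say, a version string) means contributors never have to remember to bump a cache version when touching the FFmpeg bundling code.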
### README.md

Lines changed: 2 additions & 3 deletions

```diff
@@ -106,12 +106,11 @@ Whether you're a defense consultant, enterprise executive, legal professional, o
 ### 🪟 **Windows**

 1. Download the latest `x64-setup.exe` from [Releases](https://github.com/Zackriya-Solutions/meeting-minutes/releases/latest)
-2. Right-click the downloaded file → **Properties** → Check **Unblock** → Click **OK**
-3. Run the installer (if Windows shows a security warning: Click **More info** → **Run anyway**)
+2. Run the installer

 ### 🍎 **macOS**

-1. Download `meetily_0.2.0_aarch64.dmg` from [Releases](https://github.com/Zackriya-Solutions/meeting-minutes/releases/latest)
+1. Download `meetily_0.2.1_aarch64.dmg` from [Releases](https://github.com/Zackriya-Solutions/meeting-minutes/releases/latest)
 2. Open the downloaded `.dmg` file
 3. Drag **Meetily** to your Applications folder
 4. Open **Meetily** from Applications folder
```

### frontend/package.json

Lines changed: 3 additions & 1 deletion

```diff
@@ -1,6 +1,6 @@
 {
   "name": "meetily",
-  "version": "0.2.0",
+  "version": "0.2.1",
   "private": true,
   "main": "electron/main.js",
   "scripts": {
@@ -37,6 +37,7 @@
     "@radix-ui/react-dialog": "^1.1.14",
     "@radix-ui/react-dropdown-menu": "^2.1.16",
     "@radix-ui/react-label": "^2.1.7",
+    "@radix-ui/react-popover": "^1.1.15",
     "@radix-ui/react-progress": "^1.1.8",
     "@radix-ui/react-scroll-area": "^1.2.9",
     "@radix-ui/react-select": "^2.2.5",
@@ -68,6 +69,7 @@
     "@types/lodash": "^4.17.13",
     "class-variance-authority": "^0.7.1",
     "clsx": "^2.1.1",
+    "cmdk": "^1.1.1",
     "date-fns": "^4.1.0",
     "framer-motion": "^11.15.0",
     "lodash": "^4.17.21",
```

### frontend/src-tauri/Cargo.toml

Lines changed: 4 additions & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [package]
 name = "meetily"
-version = "0.2.0"
+version = "0.2.1"
 description = "A Tauri App for meeting minutes"
 authors = ["Sujith S"]
 license = "MIT"
@@ -56,6 +56,9 @@ openmp = ["whisper-rs/openmp"] # OpenMP parallel processing
 tauri-build = { version = "2.3.0", features = [] }
 reqwest = { version = "0.11", features = ["blocking", "multipart", "json", "stream"] }
 which = "6.0.1"
+zip = "2.2"   # ZIP extraction (Windows, macOS)
+tar = "0.4"   # TAR extraction (Linux)
+xz2 = "0.1"   # XZ decompression (Linux)


 [dependencies]
```

### frontend/src-tauri/build.rs

Lines changed: 7 additions & 0 deletions

```diff
@@ -1,3 +1,6 @@
+#[path = "build/ffmpeg.rs"]
+mod ffmpeg;
+
 fn main() {
     // GPU Acceleration Detection and Build Guidance
     detect_and_report_gpu_capabilities();
@@ -11,6 +14,10 @@ fn main() {
         // Let the enhanced_macos crate handle its own Swift compilation
         // The swift-rs crate build will be handled in the enhanced_macos crate's build.rs
     }
+
+    // Download and bundle FFmpeg binary at build-time
+    ffmpeg::ensure_ffmpeg_binary();
+
     tauri_build::build()
 }
```
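For `ffmpeg::ensure_ffmpeg_binary()` to cooperate with the FFmpeg cache steps added in the workflows, the build script only needs to download when the binary is missing: on a cache hit the restored file short-circuits the fetch. A hedged std-only sketch of that guard (the real logic lives in `frontend/src-tauri/build/ffmpeg.rs`; here the download is abstracted into a closure, and the name `ensure_binary` is illustrative):

```rust
use std::path::Path;

/// Sketch of a skip-if-present guard for a build-time binary download.
/// Returns true if a fresh download was triggered, false if the binary
/// was already there (e.g. restored by the CI cache step).
fn ensure_binary(target: &Path, download: impl FnOnce(&Path)) -> bool {
    // A previous build, or actions/cache restoring ffmpeg-*, already
    // produced the binary: skip the network fetch entirely.
    if target.is_file() {
        return false;
    }
    download(target);
    true
}
```

This is what makes the `actions/cache` key pay off: rebuilds with an unchanged `build.rs`/`build/ffmpeg.rs` never touch the network.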