Bug Description
The application fails to generate a summary when the local Gemma 3 4B (Balanced) model is selected: the software cannot initialize or load the model's .gguf file from its local storage path.
Current Behavior
When the "Generate Summary" button is clicked, the process fails immediately with the following error message:
Generation failed: Failed to load model: unable to load model at "C:/Users/<>/AppData/Roaming/com.meetily.ai/models/summary/gemma-3-4b-it-Q4_K_M.gguf"
Expected Behavior
The software should successfully load the Gemma 3 model into memory and process the provided transcript to generate a summary.
Steps to Reproduce
1. Open the application.
2. Go to the AI Model selection settings.
3. Select Gemma 3 4B (Balanced).
4. Press Save.
5. Navigate to a transcript and press Generate Summary.
6. Observe the "Generation failed" error.
Environment
OS: Windows (based on file path)
Browser: N/A (Desktop App)
Version: v0.3.0
Model: Gemma 3 4B (Balanced)
Storage Path: AppData/Roaming/com.meetily.ai/models/summary/
Screenshots/Videos
Error Messages
Generation failed: Failed to load model: unable to load model at "C:/Users/<>/AppData/Roaming/com.meetily.ai/models/summary/gemma-3-4b-it-Q4_K_M.gguf"
Additional Context
The issue occurs specifically with the local Gemma 3 4B (Balanced) option; cloud-based options do not go through this local path-loading step, and other local models (if available) may load fine. This points to either corruption of the .gguf file or an incompatibility between the model loader and this particular quantization (Q4_K_M).
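To test the corruption hypothesis before re-downloading a multi-gigabyte file, a quick header check can help: valid GGUF files begin with the 4-byte magic `b"GGUF"` followed by a little-endian version number. The sketch below is a diagnostic assumption, not part of the application; the hardcoded path mirrors the error message and the user segment is a placeholder.

```python
import os
import struct

def check_gguf(path):
    """Return a short diagnosis string for a local .gguf model file.

    Checks existence, readability, size, and the 4-byte GGUF magic
    header; a failure on any of these would explain the
    'unable to load model' error without running the full loader.
    """
    if not os.path.isfile(path):
        return "missing: file not found at path"
    if not os.access(path, os.R_OK):
        return "unreadable: no read permission"
    size = os.path.getsize(path)
    if size < 8:
        return f"truncated: only {size} bytes"
    with open(path, "rb") as f:
        magic = f.read(4)
        (version,) = struct.unpack("<I", f.read(4))
    if magic != b"GGUF":
        return f"corrupt: bad magic {magic!r} (expected b'GGUF')"
    return f"ok: GGUF v{version}, {size / 1e9:.2f} GB"

# placeholder: substitute your own Windows user name
MODEL_PATH = (r"C:\Users\<user>\AppData\Roaming\com.meetily.ai"
              r"\models\summary\gemma-3-4b-it-Q4_K_M.gguf")
print(check_gguf(MODEL_PATH))
```

A "corrupt" or "truncated" result would confirm the file needs re-downloading; an "ok" result would shift suspicion to the loader's handling of this quantization.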
Possible Solution
Model Re-download: Delete the gemma-3-4b-it-Q4_K_M.gguf file from the specified path and allow the application to re-download it to ensure the file is not corrupted.
Path Permissions: Verify that the application has full read/write permissions for the Roaming/com.meetily.ai directory.
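The permissions check above can be sketched as follows; this is an assumed diagnostic, not app code. Note that `os.access` reflects effective access bits, and on Windows stricter ACLs can still deny access, so a clean result here is necessary but not sufficient (`icacls` on the directory gives the authoritative view).

```python
import os

def check_model_dir(dir_path):
    """List permission problems with the app's model directory.

    Returns an empty list when the directory exists and grants both
    read access (needed to load the model) and write access (needed
    for a re-download).
    """
    if not os.path.isdir(dir_path):
        return ["directory does not exist"]
    problems = []
    if not os.access(dir_path, os.R_OK):
        problems.append("no read permission")
    if not os.access(dir_path, os.W_OK):
        problems.append("no write permission (a re-download would fail)")
    return problems

# expands to the current user's Roaming profile on Windows
models_dir = os.path.expanduser(
    "~/AppData/Roaming/com.meetily.ai/models/summary")
print(check_model_dir(models_dir) or "permissions look OK")
```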
Checklist
[X] I have searched for similar issues
[X] I have provided all required information
[X] I have included screenshots/videos if applicable
[X] I have included error messages if applicable