Using an OpenAI-compatible LLM to fix grammar #25
Replies: 2 comments
I just realized your solution is shell-script based, so my question is really whether you think there should be a new shell script example for such a feature, or whether it should be built in and be part of the configuration?
The post-processor is intentionally a shell command hook: this keeps voxtype offline-first while letting users opt into cloud processing. I'll add example scripts for popular cloud providers, including Gemini. I'd be interested to see your take on this too, @materemias. If you'd like to work on it, I'm happy to make the space for it and let you run with it.
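To make the hook idea concrete, here is a minimal sketch of what such a post-processor script could look like. The env var names (`OPENAI_API_KEY`, `LLM_API_URL`, `LLM_MODEL`), the default model, and the function name are my own assumptions for illustration, not part of voxtype; the endpoint shape is the standard OpenAI-compatible chat-completions API. It passes text through unchanged when no API key is configured, so dictation keeps working offline.

```shell
#!/bin/sh
# Hypothetical voxtype post-processor hook (names and env vars are
# assumptions, not voxtype's actual configuration). Reads the raw
# transcript on stdin, sends it to any OpenAI-compatible chat-completions
# endpoint, and prints the corrected text on stdout.

fix_grammar() {
    text=$(cat)

    # Offline fallback: with no key configured, skip the cloud call and
    # echo the transcript unchanged so the tool stays offline-first.
    if [ -z "${OPENAI_API_KEY:-}" ]; then
        printf '%s' "$text"
        return 0
    fi

    api_url="${LLM_API_URL:-https://api.openai.com/v1/chat/completions}"
    model="${LLM_MODEL:-gpt-4o-mini}"

    # jq builds the request body so the transcript is JSON-escaped safely.
    body=$(jq -n --arg model "$model" --arg text "$text" '{
        model: $model,
        messages: [
            {role: "system",
             content: "Fix grammar and punctuation. Return only the corrected text."},
            {role: "user", content: $text}
        ]
    }')

    # Call the endpoint and extract the first completion.
    curl -sS "$api_url" \
        -H "Authorization: Bearer $OPENAI_API_KEY" \
        -H "Content-Type: application/json" \
        -d "$body" | jq -r '.choices[0].message.content'
}

fix_grammar
```

The same script would work against a local Ollama server or any cloud provider exposing an OpenAI-compatible endpoint, just by changing `LLM_API_URL` and `LLM_MODEL`.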
Hi Pete,
I noticed your recent work on improving grammar and transcription with Ollama. I was considering doing something similar using a cloud LLM for faster processing; Gemini 3 Flash looks promising due to its efficiency, speed, and cost-effectiveness.
Are you planning to implement a cloud-based solution, or should I explore this myself to avoid duplicating effort? I'd prefer to check whether it's already on your roadmap first.