Description
Describe the issue
When a local model is used (e.g. Ollama), you currently need to wait for the model to come fully online before proceeding.
It would be better to have an HTTP handler manage this, so requests are not constrained by model availability.
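As a rough illustration of the suggested behavior, the handler could poll Ollama's model-listing endpoint (`/api/tags`) until the server responds, instead of blocking unconditionally. This is only a sketch of the idea, not CodeGate's actual code; the function names and the timeout values are assumptions.

```python
import time
import urllib.error
import urllib.request

# Ollama's default model-listing endpoint; a 200 response means the server is up.
OLLAMA_URL = "http://localhost:11434/api/tags"

def wait_until_ready(probe, timeout=30.0, interval=1.0):
    """Poll `probe` until it returns True or `timeout` seconds elapse.

    Returns True if the probe succeeded within the deadline, False otherwise,
    so the caller can fail fast with a clear error instead of hanging.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False

def ollama_is_up(url=OLLAMA_URL):
    """Probe the Ollama HTTP API; any connection error counts as 'not ready'."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

An HTTP handler could call `wait_until_ready(ollama_is_up, timeout=...)` with a short timeout and return a clear "model not available" response on failure, rather than stalling the CLI.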
Steps to Reproduce
Set up a model in Ollama, then call the CodeGate CLI (e.g. the workspaces command).
Operating System
macOS (Arm)
IDE and Version
N/A
Extension and Version
N/A
Provider
Ollama
Model
any
Codegate version
main
Logs
No response
Additional Context
No response