How hard would it be to wrap it in a proxy server that implements the OpenAI-compatible protocol?
Benefits:
- It would be enough to integrate with all popular AI coding agents, since they all support OpenAI-compatible models.
- It could be hosted remotely, which would eliminate the local disk space and hardware requirements.
The protocol itself seems fairly simple and has only 4 endpoints: https://lmstudio.ai/docs/api/endpoints/openai
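
To make the idea concrete, here is a minimal sketch of such a proxy using FastAPI (just an assumption, any HTTP framework would do). It covers the two endpoints coding agents typically need, `GET /v1/models` and `POST /v1/chat/completions`; `generate_reply` is a hypothetical placeholder for the actual model backend, and streaming, `/v1/completions`, and `/v1/embeddings` are left out for brevity.

```python
# Minimal sketch of an OpenAI-compatible proxy (assumes FastAPI + Pydantic, Python 3.10+).
import time
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class ChatMessage(BaseModel):
    role: str
    content: str


class ChatCompletionRequest(BaseModel):
    model: str
    messages: list[ChatMessage]
    temperature: float | None = None


def generate_reply(messages: list[ChatMessage]) -> str:
    # Hypothetical placeholder: replace with the real call into the model backend.
    return "Hello from the proxy!"


@app.get("/v1/models")
def list_models():
    # Agents usually call this first to discover which models are available.
    return {
        "object": "list",
        "data": [{"id": "my-model", "object": "model", "owned_by": "local"}],
    }


@app.post("/v1/chat/completions")
def chat_completions(req: ChatCompletionRequest):
    reply = generate_reply(req.messages)
    # Shape the response the way OpenAI clients expect a non-streaming completion.
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": reply},
                "finish_reason": "stop",
            }
        ],
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    }
```

Run it with `uvicorn proxy:app --port 8000` and point a coding agent at `http://localhost:8000/v1` as its OpenAI base URL; the same server could just as well be deployed remotely.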