Conversation

Hi, thanks for the contribution! I am indeed considering adding and maintaining a Responses API provider. If so, the right way to implement this is through a separate provider within vim-ai. But I am still not sure whether the Responses API is going to be as widely adopted as the Chat API is. As far as I know, Responses support is still in beta on OpenRouter and only partially implemented in Ollama. So maybe the best approach for now is to extract it into an external provider and merge it into vim-ai once it matures.

I tried that external provider and it didn't work for me. The latest GPT release again works with the Chat API, so it's a bit of a ping-pong when the Codex API is needed for coding with the latest models. In principle the Responses API is touted as better suited to coding tasks, though perhaps more with an agent than with a chat interface. In any case, the aforementioned provider was not reliable, and if the Responses API is here to stay, it could live inside the plug-in itself as well. But as with any PR, this is a suggestion, not an obligation.

Well, I am not talking about using the existing Responses provider, which has its limitations, but rather about implementing a new one that will work without third-party Python dependencies.

Oh, making vim-ai work without Python dependencies would be convenient. In principle, it seems possible.

I don't think I understand; vim-ai does not have any Python dependencies.

Interesting. I always had trouble using vim-ai, say in Git Bash, due to missing Python.

Yes, Python is required; third-party Python modules are not.
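
As a side note on that Python requirement: whether a given Vim build (for example the one available in Git Bash) can run vim-ai depends on it being compiled with the Python 3 interface. The `has()` check below is a standard built-in Vim feature, not anything vim-ai-specific:

```vim
" Inside Vim: prints 1 if this build includes the +python3 interface, 0 otherwise
:echo has('python3')
```

From a shell, `vim --version` lists the compiled-in features; a build that can run vim-ai shows `+python3` (or `+python3/dyn`) in that list.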

Closing, as this has to be implemented as an internal or external provider.
In the hope that the stance in #162 on solely using the Chat API has been softened.