Anthropic compatible api support #3544
CaiDingxian started this conversation in 1. Feature requests
Replies: 1 comment
+1 This ☝️
Currently, many LLM providers have implemented compatibility with Anthropic's protocol, similar to OpenAI-compatible APIs. However, in KiloCode's Anthropic adapter, the model names are hardcoded and do not allow custom model identifiers such as MinMax-M2, claude-haiku-4-5, or claude-sonnet-4-5-20250929.
In practice, I've found that the OpenAI-compatible API cannot apply token caching for certain models that only support it through the Anthropic protocol.
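For context, prompt caching under the Anthropic Messages protocol is requested by marking content blocks with `cache_control`. The snippet below is a minimal sketch of such a request; the provider base URL and model name are placeholders, not KiloCode code:

```typescript
// Minimal sketch of an Anthropic-protocol request that opts into prompt caching.
// The base URL and model name are placeholders for an Anthropic-compatible provider.
const response = await fetch("https://api.example-provider.com/v1/messages", {
  method: "POST",
  headers: {
    "content-type": "application/json",
    "x-api-key": process.env.PROVIDER_API_KEY ?? "",
    "anthropic-version": "2023-06-01",
  },
  body: JSON.stringify({
    model: "claude-sonnet-4-5-20250929",
    max_tokens: 1024,
    system: [
      {
        type: "text",
        text: "Long, reusable system prompt goes here...",
        // Marks this block as cacheable so repeated requests can reuse the prefix.
        cache_control: { type: "ephemeral" },
      },
    ],
    messages: [{ role: "user", content: "Hello" }],
  }),
});
```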
By manually editing packages\types\src\providers\anthropic.ts and src\api\providers\anthropic.ts to hardcode new models, I was able to get them working effectively.
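To illustrate the workaround, this is roughly the kind of entry I mean. The field names (`maxTokens`, `contextWindow`, `supportsPromptCache`) are assumptions based on typical provider model maps, not necessarily KiloCode's actual type definitions:

```typescript
// Rough sketch of adding model entries in packages/types/src/providers/anthropic.ts.
// Field names are assumptions, not necessarily KiloCode's real ModelInfo shape.
export const anthropicModels = {
  "claude-sonnet-4-5-20250929": {
    maxTokens: 8192,
    contextWindow: 200_000,
    supportsPromptCache: true,
  },
  // Manually added third-party model served over the Anthropic protocol.
  "MinMax-M2": {
    maxTokens: 8192,
    contextWindow: 128_000,
    supportsPromptCache: true,
  },
} as const;
```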
However, I would prefer official support for custom models, which would enable better compatibility with the latest official models as well as third-party models.
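Purely as an illustration of what official support might look like (this interface does not exist in KiloCode today), letting the user supply an arbitrary model ID plus optional capability overrides would cover both the latest official models and third-party ones:

```typescript
// Hypothetical settings shape illustrating the request; not an existing KiloCode API.
interface AnthropicCompatibleSettings {
  baseUrl: string;          // third-party endpoint that speaks the Anthropic protocol
  apiKey: string;
  modelId: string;          // free-form, e.g. "MinMax-M2" or "claude-haiku-4-5"
  // Optional overrides for capabilities that cannot be auto-detected.
  contextWindow?: number;
  maxTokens?: number;
  supportsPromptCache?: boolean;
}
```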