Description
What happened?
OpenAI changed the API for its reasoning models o1, o1-mini, and o3-mini: these models no longer accept the max_tokens parameter and instead require max_completion_tokens. ProxyAI does not handle this yet, so it currently cannot be used with any of the OpenAI reasoning models.
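For reference, a minimal sketch of the parameter change against the raw Chat Completions endpoint (the model name and prompt are placeholders, and it assumes OPENAI_API_KEY is set in the environment and the requests package is installed):

```python
import os
import requests

url = "https://api.openai.com/v1/chat/completions"
headers = {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}

body = {
    "model": "o1-mini",  # any reasoning model: o1, o1-mini, o3-mini
    "messages": [{"role": "user", "content": "Hello"}],
    # "max_tokens": 256,           # rejected by reasoning models
    "max_completion_tokens": 256,  # required replacement
}

response = requests.post(url, headers=headers, json=body)
print(response.status_code, response.json())
```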
Relevant log output or stack trace
Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Steps to reproduce
- Send a request through ProxyAI to any OpenAI reasoning model (o1, o1-mini, or o3-mini); the sketch after this list reproduces the same failure against the API directly
- Observe the error message: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
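The failure can also be reproduced without ProxyAI by sending the old parameter to the endpoint directly. A sketch under the same assumptions as above (placeholder prompt, OPENAI_API_KEY in the environment):

```python
import os
import requests

# Sending the deprecated max_tokens parameter to a reasoning model
# returns HTTP 400 with the error message quoted above.
response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "o1-mini",
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 256,  # rejected: reasoning models expect max_completion_tokens
    },
)
print(response.status_code)  # expected: 400
print(response.json()["error"]["message"])
```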
CodeGPT version
3.2.2-241.1
Operating System
None