
ProxyAI cannot be used with reasoning models #980

Open
@Harrington-Bend

Description


What happened?

OpenAI's reasoning models (o1, o1-mini, and o3-mini) changed their API.

They no longer accept the max_tokens parameter; instead they require a max_completion_tokens parameter.

ProxyAI cannot handle this yet, so ProxyAI cannot be used with the OpenAI reasoning models.
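A minimal sketch of the client-side fix, assuming the request body is built as a plain dict before being sent to the Chat Completions endpoint (the model set and helper name here are illustrative, not ProxyAI's actual code):

```python
# Models known to reject max_tokens (assumption: list current as of this issue).
REASONING_MODELS = {"o1", "o1-mini", "o3-mini"}

def build_completion_payload(model: str, messages: list, limit: int) -> dict:
    """Build a Chat Completions request body using the token-limit
    parameter the given model actually accepts."""
    payload = {"model": model, "messages": messages}
    if model in REASONING_MODELS:
        # Reasoning models require the newer parameter name.
        payload["max_completion_tokens"] = limit
    else:
        # Older chat models still accept max_tokens.
        payload["max_tokens"] = limit
    return payload
```

For example, `build_completion_payload("o1-mini", msgs, 256)` would produce a body containing `max_completion_tokens` and no `max_tokens`, avoiding the "Unsupported parameter" error quoted below.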

Relevant log output or stack trace

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

Steps to reproduce

  1. Send a request to any OpenAI reasoning model (o1, o1-mini, or o3-mini).
  2. Observe the error message: "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead."

CodeGPT version

3.2.2-241.1

Operating System

None

Labels: bug