
json_schema structured output type not supported in gpt-4o assistants #1857

Open
@ghost

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

I'm not sure whether this is a bug in the docs or in the code. The API reference for the response_format parameter of assistants.create says:

Specifies the format that the model must output. Compatible with GPT-4o, GPT-4 Turbo, and all GPT-3.5 Turbo models since gpt-3.5-turbo-1106.

Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema.

But I am seeing this error message:

"Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version gpt-4o."

>>> from pydantic import BaseModel
>>> 
>>> class Project_Spending(BaseModel):
...     Authority_Name: str
...     Project_Count: int
...     Total_spending: int
... 
>>> client.beta.assistants.create(
...     name = "prjdata",
...     tools = [{'type':'file_search'}],
...     model = "gpt-4o",
...     description = "find the required data from the attachment",
...     response_format={
...         'type': 'json_schema',
...         'json_schema':
...         {
...             "name": "Project_Spending",
...             "schema": Project_Spending.model_json_schema()
...         }
...     }
... )
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/resources/beta/assistants.py", line 146, in create
    return self._post(
           ^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 1277, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 954, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 1058, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version `gpt-4o`.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}
>>> 
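For reference, the payload being sent matches the shape the docs describe. Here it is built from a plain JSON Schema dict (equivalent to what `Project_Spending.model_json_schema()` produces, but with no dependencies), which suggests the 400 is about the model version rather than a malformed payload:

```python
import json

# JSON Schema equivalent to Project_Spending.model_json_schema()
schema = {
    "type": "object",
    "properties": {
        "Authority_Name": {"type": "string"},
        "Project_Count": {"type": "integer"},
        "Total_spending": {"type": "integer"},
    },
    "required": ["Authority_Name", "Project_Count", "Total_spending"],
    "additionalProperties": False,
}

# The response_format payload shape from the API reference
response_format = {
    "type": "json_schema",
    "json_schema": {"name": "Project_Spending", "schema": schema},
}

print(json.dumps(response_format, indent=2))
```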

To Reproduce

Call OpenAI().beta.assistants.create() with model="gpt-4o" and a response_format of type json_schema, as in the snippet above.
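A possible workaround (an assumption on my part, not confirmed in this issue): Structured Outputs was announced alongside the gpt-4o-2024-08-06 snapshot, so if the bare "gpt-4o" alias resolves to an earlier snapshot here, pinning the dated model may avoid the 400. A sketch, with the API call wrapped in a function so it only executes when you actually call it with valid credentials:

```python
def create_structured_assistant(schema: dict, model: str = "gpt-4o-2024-08-06"):
    """Create an assistant with a json_schema response_format, pinning a
    dated gpt-4o snapshot instead of the bare "gpt-4o" alias.

    Hypothetical workaround: assumes the alias resolves to a snapshot
    that predates Structured Outputs support."""
    # Imported lazily so this module loads without the openai package;
    # calling the function requires openai installed and OPENAI_API_KEY set.
    from openai import OpenAI

    client = OpenAI()
    return client.beta.assistants.create(
        name="prjdata",
        tools=[{"type": "file_search"}],
        model=model,
        description="find the required data from the attachment",
        response_format={
            "type": "json_schema",
            "json_schema": {"name": "Project_Spending", "schema": schema},
        },
    )
```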

Code snippets

No response

OS

macOS

Python version

Python 3.12.7

Library version

openai v1.52.2
