
Structured outputs response_format requires strict function calling JSON Schema? #1733

Open
@moonbox3

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API issue

  • This is an issue with the Python library

Describe the bug

I am using the OpenAI Python library 1.47.0 with the model gpt-4o-2024-08-06. I have the json_schema response format working with both Pydantic and non-Pydantic models (non-Pydantic meaning I manually build the proper response-format JSON schema) when no tool calling is involved. However, when I include tools in the payload sent to:

client.beta.chat.completions.parse(...)

I am getting a 400 because the tool's JSON schema does not include the strict / additionalProperties keys.

The error shows as:

ValueError('`weather-get_weather_for_city` is not strict. Only `strict` function tools can be auto-parsed')
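
For reference, a minimal sketch of the failing call (the WeatherReport model, messages, and tool body below are illustrative, not my exact code):

```python
from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()


# Illustrative Pydantic model used as the response_format
class WeatherReport(BaseModel):
    city: str
    summary: str


# Tool definition without "strict" / "additionalProperties"
tools = [
    {
        "type": "function",
        "function": {
            "name": "weather-get_weather_for_city",
            "description": "Get the weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The input city"}
                },
                "required": ["city"],
            },
        },
    }
]

# Fails with:
# ValueError('`weather-get_weather_for_city` is not strict. Only `strict` function tools can be auto-parsed')
completion = client.beta.chat.completions.parse(
    model="gpt-4o-2024-08-06",
    messages=[{"role": "user", "content": "What's the weather in Seattle?"}],
    response_format=WeatherReport,
    tools=tools,
)
```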

When I do add strict: true and additionalProperties: false to the tool definition, I get a 200:

{
    "type": "function",
    "function": {
        "name": "weather-get_weather_for_city",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "The input city"
                }
            },
            "required": ["city"],
            "additionalProperties": false
        },
        "strict": true
    }
}
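
For what it's worth, one way to produce a strict-compatible tool schema like the one above is the library's openai.pydantic_function_tool helper; the model below is illustrative:

```python
import openai
from pydantic import BaseModel, Field


class GetWeatherForCity(BaseModel):
    """Get the weather for a city"""

    city: str = Field(description="The input city")


# Generates a tool dict with "strict": true and "additionalProperties": false,
# equivalent to the hand-written schema above.
tool = openai.pydantic_function_tool(
    GetWeatherForCity, name="weather-get_weather_for_city"
)
```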

In your docs, I don't see this coupling between the function calling schema and the json_schema response format called out (if it is there, I am obviously missing it).

The docs say:

Structured Outputs is available in two forms in the OpenAI API:

- When using [function calling](https://platform.openai.com/docs/guides/function-calling)
- When using a json_schema response format

This makes it seem like they're able to be used independently.

As an additional note: in .NET, I can use the OpenAI library to call the normal chat completions endpoint, configure the proper strict JSON schema for the json_schema response format, and I do not need to modify the function calling JSON schema to include strict or additionalProperties; the calls work fine, with no 400s. Something like this:

chatCompletion = (await RunRequestAsync(() => this.Client!.GetChatClient(targetModel).CompleteChatAsync(chatForRequest, chatOptions, cancellationToken)).ConfigureAwait(false)).Value;
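
In Python terms, the comparable plain-endpoint call (no beta parse helper, a hand-built json_schema response format, and the same non-strict tool list) would look roughly like this sketch; the schema body here is illustrative:

```python
# Sketch only: send a hand-built json_schema response_format through the
# regular chat.completions.create call, reusing the non-strict `tools` list
# defined earlier.
completion = client.chat.completions.create(
    model="gpt-4o-2024-08-06",
    messages=[{"role": "user", "content": "What's the weather in Seattle?"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "weather_report",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "summary": {"type": "string"},
                },
                "required": ["city", "summary"],
                "additionalProperties": False,
            },
        },
    },
    tools=tools,  # same non-strict tool definition as above
)
```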

To Reproduce

  1. Use the latest OpenAI package
  2. Configure a Pydantic model as the response_format
  3. Include a tool (with non-strict JSON Schema) with the payload
  4. Make a call to client.beta.chat.completions.parse(...)
  5. Observe the 400 because the function calling schema is missing the strict / additionalProperties keys.

Code snippets

No response

OS

MacOS

Python version

Python 3.12.5

Library version

openai 1.47.0
