support for responses.create() with AzureOpenAI and AsyncAzureOpenAI #2280

Open
@NikGor

Description

Confirm this is a feature request for the Python library and not the underlying OpenAI API.

  • This is a feature request for the Python library

Describe the feature or improvement you're requesting

Hi OpenAI team 👋

We’ve noticed that, as of now, the AzureOpenAI and AsyncAzureOpenAI clients do not expose the .responses resource like the default OpenAI client does. This is a bit limiting, especially considering that Azure OpenAI has recently added support for the /openai/responses endpoint as part of the 2025-03-01-preview API version.

Currently, attempting to use:

from openai import AsyncAzureOpenAI

client = AsyncAzureOpenAI(...)
await client.responses.create(...)

raises an AttributeError because .responses is not available on that class.

It would be great if .responses were added to AzureOpenAI and AsyncAzureOpenAI, similar to how .chat.completions is exposed.

• Azure OpenAI now supports the /openai/responses endpoint (as of the 2025-03-01-preview API version)
• The model / deployment ID is passed in the JSON body, not in the URL path
• No need to add the route to _deployments_endpoints
• Adding this would make the SDK consistent and easier to use for Azure users

Thanks in advance! Happy to contribute a PR if you’re open to it.

Best,
Nikolai


Metadata

Assignees

No one assigned

Labels

Azure (for issues relating to the Azure OpenAI service)
