[bug/feat] Azure OpenAI support for proxy endpoint #86

Open
@CalebCourier

Description

We currently support OpenAI chat completions via the /guards/{guardName}/openai/v1/chat/completions endpoint. At face value, it looks like we should also be able to support Azure's OpenAI hostings as well, but as discovered by one of our users in guardrails-ai/guardrails#1159, the endpoint constructed by the AzureOpenAI client differs from the one constructed by the OpenAI client.
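To make the suspected divergence concrete, here is a hedged sketch of the request URLs the two clients are generally understood to build when pointed at the proxy. The exact shapes (the `deployments` path segment, the `api-version` query parameter, and Azure's use of an `api-key` header instead of `Authorization: Bearer`) are assumptions that should be confirmed by capturing real requests; the proxy hostname and version strings below are purely illustrative.

```python
def openai_chat_path(base_url: str) -> str:
    # The OpenAI client appends /chat/completions to the base_url it is
    # given, so base_url="https://proxy/guards/my-guard/openai/v1" hits
    # the existing proxy route directly.
    return f"{base_url}/chat/completions"


def azure_chat_path(azure_endpoint: str, deployment: str, api_version: str) -> str:
    # The AzureOpenAI client is believed to append an
    # /openai/deployments/{deployment} segment and an api-version query
    # parameter to whatever azure_endpoint it is given, which is why the
    # same azure_endpoint does not land on the existing route.
    return (
        f"{azure_endpoint}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )


# Illustrative comparison (hostnames and versions are made up):
print(openai_chat_path("https://proxy/guards/my-guard/openai/v1"))
print(azure_chat_path("https://proxy/guards/my-guard/openai", "gpt-4o", "2024-06-01"))
```

If this assumption holds, the same guard-scoped base URL produces two different concrete paths, one with a deployment segment and query string and one without.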

We should test this thoroughly to see what the differences are (e.g. endpoint, payload, headers) and, if feasible, handle both clients' requests with the same route logic by adding some additional path parameters or wildcards. Otherwise we will need to stand up a new endpoint specifically to support Azure's implementation.
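The single-route option could look something like the following framework-agnostic sketch: one pattern that accepts both the existing OpenAI-style path and the Azure-style path assumed above (deployment segment under the same guard prefix), capturing the guard name and, for Azure requests, the deployment. The pattern and the Azure path shape are assumptions, not verified behavior, and the matcher expects the query string to have been stripped first.

```python
import re

# Hypothetical combined route: the (?:...) alternation accepts either the
# /v1 segment the OpenAI client produces or the /openai/deployments/{...}
# segment the AzureOpenAI client is believed to produce.
CHAT_ROUTE = re.compile(
    r"^/guards/(?P<guard_name>[^/]+)/openai"
    r"(?:/v1|/openai/deployments/(?P<deployment>[^/]+))"
    r"/chat/completions$"
)


def match_chat_route(path: str):
    """Return guard name and optional deployment, or None if no match."""
    m = CHAT_ROUTE.match(path)
    if m is None:
        return None
    return {
        "guard_name": m.group("guard_name"),
        # None for plain OpenAI requests; set for Azure-style requests.
        "deployment": m.group("deployment"),
    }
```

With a matcher like this, a single handler could normalize both request shapes before the shared completion logic runs; header differences (Bearer token vs. `api-key`) would still need separate handling.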
