We currently support OpenAI chat completions via the /guards/{guardName}/openai/v1/chat/completions
endpoint. At face value, it looks like we should be able to support Azure's OpenAI hosting as well, but as discovered by one of our users in guardrails-ai/guardrails#1159, the endpoint constructed by the AzureOpenAI client differs from the one constructed by the OpenAI client.
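For reference, here is a minimal sketch of how the two clients build their requests against a Guardrails server. The server URL, guard name, and deployment name are all illustrative, and the Azure URL/header behavior described in the comments is our current understanding of the SDK, to be confirmed by testing:

```python
from openai import AzureOpenAI, OpenAI

# Standard OpenAI client: requests land on {base_url}/chat/completions,
# which matches the existing /guards/{guardName}/openai/v1/chat/completions route.
openai_client = OpenAI(
    base_url="http://localhost:8000/guards/my-guard/openai/v1",  # illustrative server/guard
    api_key="sk-not-a-real-key",
)

# AzureOpenAI client: requests appear to go to
# {azure_endpoint}/openai/deployments/{deployment}/chat/completions?api-version=...,
# with the key sent in an `api-key` header rather than `Authorization: Bearer`.
azure_client = AzureOpenAI(
    azure_endpoint="http://localhost:8000/guards/my-guard",  # illustrative
    azure_deployment="gpt-4o",                               # illustrative deployment name
    api_version="2024-02-01",
    api_key="not-a-real-key",
)
```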
We should test this thoroughly to see what the differences are (e.g. endpoint, payload, headers, etc.) and, if feasible, handle both clients' requests with the same route logic by adding some additional path parameters or wildcards (see the sketch below). Otherwise, we will need to stand up a new endpoint specifically to support Azure's implementation.
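If unifying the routes turns out to be feasible, one possible shape is to accept the Azure-style path alongside the existing one and delegate both to the same handler. The sketch below uses FastAPI-style path parameters; the actual framework, route names, and handler are placeholders, not the current guardrails-api implementation:

```python
from fastapi import FastAPI, Request

app = FastAPI()


async def handle_chat_completion(guard_name: str, request: Request) -> dict:
    # Placeholder for the existing chat-completions handling logic.
    payload = await request.json()
    return {"guard": guard_name, "echo": payload}


# Existing OpenAI-compatible route.
@app.post("/guards/{guard_name}/openai/v1/chat/completions")
async def openai_chat_completions(guard_name: str, request: Request) -> dict:
    return await handle_chat_completion(guard_name, request)


# Azure-style route: the AzureOpenAI client appends
# /openai/deployments/{deployment}/chat/completions plus an api-version query param.
@app.post("/guards/{guard_name}/openai/deployments/{deployment}/chat/completions")
async def azure_chat_completions(guard_name: str, deployment: str, request: Request) -> dict:
    # The deployment name plays the role of the model; api-version can be read
    # from request.query_params if the handler needs it.
    return await handle_chat_completion(guard_name, request)
```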