
fix: update Azure Provider with openai models#212

Open
Valkea wants to merge 1 commit into andrewyng:main from Valkea:valkea/azure_openai_fix
Conversation

@Valkea Valkea commented Mar 4, 2025

The current implementation of the Azure provider works with some models but not with others.

Some will use:

    https://YOUR_DEPLOYMENT_NAME.YOUR_REGION_NAME.models.ai.azure.com/v1/chat/completions?api-version=YOUR_API_VERSION
    headers = {"Content-Type": "application/json", "Authorization": self.api_key}

And others will call:

    https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME
    headers = {"Content-Type": "application/json", "api-key": self.api_key}

So I updated the provider to detect which endpoint style is in use and prepare the request parameters accordingly.

        if ".openai.azure.com" in self.base_url:
            # Azure OpenAI resource endpoint: the deployment name goes in
            # the path and the key is sent in an "api-key" header.
            url = f"{self.base_url}/openai/deployments/{model}/chat/completions"
            headers = {"Content-Type": "application/json", "api-key": self.api_key}
        else:
            # Serverless (models.ai.azure.com) endpoint: plain completions
            # path, key sent in the "Authorization" header.
            url = f"{self.base_url}/chat/completions"
            headers = {"Content-Type": "application/json", "Authorization": self.api_key}

siddheshp commented Dec 28, 2025

In Foundry, I see the endpoint URL as https://{resource-name}.cognitiveservices.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2025-01-01-preview.

@Valkea can you add this too? It seems this is the change needed for Foundry projects.
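
A minimal sketch of how the detection could be extended to also cover the `*.cognitiveservices.azure.com` Foundry endpoints, which use the same `/openai/deployments/{model}` path and `api-key` header as the `*.openai.azure.com` case. The standalone function name `build_request` is hypothetical (the PR's code lives inside the provider class), and the exact endpoint/auth pairing for Foundry is an assumption based on the URL quoted above:

```python
def build_request(base_url: str, model: str, api_key: str) -> tuple[str, dict]:
    """Pick the URL shape and auth header for a given Azure endpoint.

    Hypothetical helper, not the PR's actual code. Assumes both
    *.openai.azure.com and *.cognitiveservices.azure.com endpoints route
    through /openai/deployments/{model} with an "api-key" header, while
    serverless models.ai.azure.com endpoints use a plain /chat/completions
    path with an "Authorization" header.
    """
    if ".openai.azure.com" in base_url or ".cognitiveservices.azure.com" in base_url:
        url = f"{base_url}/openai/deployments/{model}/chat/completions"
        headers = {"Content-Type": "application/json", "api-key": api_key}
    else:
        url = f"{base_url}/chat/completions"
        headers = {"Content-Type": "application/json", "Authorization": api_key}
    return url, headers
```

Keeping the check as a simple substring match preserves the PR's approach; only the list of hostname suffixes grows.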
