# feat: add Azure Foundry provider support (OpenAI-compatible) #835

VDurocher wants to merge 3 commits into osaurus-ai:main
## Conversation
Hello @VDurocher, thank you for the contribution. Sadly, it fails CI; please fix your code and push the changes. Keep in mind I've rebased your changes on top of the main branch, so you'll need to fetch the branch on your computer.
Local status update: I prepared fixes for the failing test-core path (the broken model-id map in `RemoteProviderService` and the missing `.azureOpenAI` handling in `RemoteProviderManager`) and verified the exact `OsaurusCoreTests` CI-equivalent run locally.

I attempted to push the update back to VDurocher/osaurus:feat/azure-foundry-provider, but GitHub rejected the maintainer push from my account with 403 Permission denied. The remaining blocker on this PR is branch write access, not local test coverage.
Published a maintainer-owned replacement for this branch as #865 because the validated fix could not be pushed back to the contributor branch. The replacement is rebased onto the current main branch.
@copilot resolve the merge conflicts in this pull request |
Clean-PR rule note: this PR is still marked ready, but it has a failing CI check.
## Summary
Closes #555
Adds first-class support for Azure OpenAI / Azure AI Foundry as a remote provider. Azure uses the OpenAI-compatible `chat/completions` request/response format but differs in two key areas:

- Authentication: an `api-key: <key>` header instead of `Authorization: Bearer <key>`
- URL structure: `https://<resource>.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=<version>`, where the deployment name replaces the `model` field in the URL path

## Changes
- `RemoteProviderConfiguration.swift`
  - Adds `case azureOpenAI` to the `RemoteProviderType` enum
  - `RemoteProvider`: new `azureDeploymentName` and `azureAPIVersion` fields (`azureAPIVersion` defaults to `"2024-02-01"`)
  - New `azureChatCompletionsURL()` helper that constructs the full deployment URL with the `api-version` query param
  - New `azureModelsURL()` helper for the `/openai/models` endpoint
  - `resolvedHeaders()` injects `api-key: <key>` for Azure (instead of `Authorization: Bearer`)
  - New fields use `decodeIfPresent` for full backward compatibility with existing config files
- `RemoteProviderService.swift`
  - `buildURLRequest`: routes Azure providers through `azureChatCompletionsURL()` instead of the generic `host + basePath + endpoint` pattern
  - Azure accepts the standard `ChatCompletionRequest` format (no conversion needed)
  - `parseResponse` switch: Azure reuses `ChatCompletionResponse` decoding (identical to `openaiLegacy`)
  - Streaming reuses the `ChatCompletionChunk` branch; Azure's streaming format is identical
  - `fetchModels`: dispatches to the new `fetchAzureOpenAIModels(from:)`, which calls `/openai/models?api-version=<version>`
- `ProviderPresets.swift`
  - New `case azure` preset with an Azure blue brand gradient, the `cloud.fill` SF Symbol, and help steps tailored to the Azure Portal workflow
  - `matching(provider:)` identifies Azure providers by `providerType == .azureOpenAI` (host varies per resource, so string matching isn't reliable)

## Configuration example
- Host: `myresource.openai.azure.com`
- Authentication: `api-key` header
- Deployment name: `gpt-4o-deployment`
- API version: `2024-02-01` (default) or any supported version

The resulting chat completions URL: `https://myresource.openai.azure.com/openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01`
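As a rough sketch of how that URL might be assembled, assuming a `RemoteProvider` shape like the one described above (the field and helper names follow this PR description, but the real type has more fields and may differ):

```swift
import Foundation

// Sketch only: a minimal RemoteProvider with just the Azure-relevant fields.
struct RemoteProvider {
    var host: String                // e.g. "myresource.openai.azure.com"
    var azureDeploymentName: String // replaces the `model` field in the URL path
    var azureAPIVersion: String = "2024-02-01"

    /// Builds https://<host>/openai/deployments/<deployment>/chat/completions?api-version=<version>
    func azureChatCompletionsURL() -> URL? {
        var components = URLComponents()
        components.scheme = "https"
        components.host = host
        components.path = "/openai/deployments/\(azureDeploymentName)/chat/completions"
        components.queryItems = [URLQueryItem(name: "api-version", value: azureAPIVersion)]
        return components.url
    }
}

let provider = RemoteProvider(host: "myresource.openai.azure.com",
                              azureDeploymentName: "gpt-4o-deployment")
print(provider.azureChatCompletionsURL()!.absoluteString)
// → https://myresource.openai.azure.com/openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01
```

Using `URLComponents` rather than string concatenation keeps the deployment name and query parameter properly percent-encoded.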
## Design decisions
- Azure's payloads are OpenAI-compatible `ChatCompletionRequest`/`ChatCompletionResponse`, so we reuse them entirely (DRY).
- The new fields (`azureDeploymentName`, `azureAPIVersion`) use `decodeIfPresent` so existing config files deserialise without errors.
- Azure streams the same `ChatCompletionChunk` format; no special handling needed.
- Response decoding is identical to `openaiLegacy`.
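Taken together, the header and backward-compatibility decisions above might look roughly like this. This is a hedged sketch: the enum cases and the surrounding type are simplified, and the actual implementation in the PR may differ.

```swift
import Foundation

// Simplified; the real enum has more cases.
enum RemoteProviderType: String, Codable {
    case openaiLegacy, azureOpenAI
}

struct RemoteProvider: Codable {
    var providerType: RemoteProviderType
    var apiKey: String
    var azureDeploymentName: String?
    var azureAPIVersion: String

    // `decodeIfPresent` lets old config files, which lack the Azure keys,
    // keep decoding cleanly; the API version falls back to its default.
    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        providerType = try c.decode(RemoteProviderType.self, forKey: .providerType)
        apiKey = try c.decode(String.self, forKey: .apiKey)
        azureDeploymentName = try c.decodeIfPresent(String.self, forKey: .azureDeploymentName)
        azureAPIVersion = try c.decodeIfPresent(String.self, forKey: .azureAPIVersion) ?? "2024-02-01"
    }

    // Azure wants `api-key: <key>`; everything else keeps the Bearer scheme.
    func resolvedHeaders() -> [String: String] {
        switch providerType {
        case .azureOpenAI:
            return ["api-key": apiKey]
        default:
            return ["Authorization": "Bearer \(apiKey)"]
        }
    }
}

// An old config file without any Azure keys still decodes, picking up the default version.
let legacyJSON = Data(#"{"providerType":"openaiLegacy","apiKey":"sk-test"}"#.utf8)
let legacy = try! JSONDecoder().decode(RemoteProvider.self, from: legacyJSON)
print(legacy.azureAPIVersion)                          // "2024-02-01"
print(legacy.resolvedHeaders()["Authorization"]!)      // "Bearer sk-test"
```

Making the fallback happen at decode time (rather than scattering `?? "2024-02-01"` at call sites) keeps the default in one place.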