
feat: add Azure Foundry provider support (OpenAI-compatible)#835

Open
VDurocher wants to merge 3 commits into osaurus-ai:main from VDurocher:feat/azure-foundry-provider

Conversation

@VDurocher
Contributor

Summary

Closes #555

Adds first-class support for Azure OpenAI / Azure AI Foundry as a remote provider. Azure uses the OpenAI-compatible chat/completions request/response format but differs in two key areas:

  • Authentication: api-key: <key> header instead of Authorization: Bearer <key>
  • URL structure: https://<resource>.openai.azure.com/openai/deployments/<deployment>/chat/completions?api-version=<version> — the deployment name replaces the model field in the URL path
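The two differences above can be sketched in a few lines. This is a minimal illustration, not the PR's actual code; the function and parameter names (`azureChatCompletionsURL`, `resource`, `deployment`, `apiVersion`) are assumed for the example.

```swift
import Foundation

// Hypothetical sketch: build the Azure deployment URL with its api-version
// query parameter, then attach Azure's bare `api-key` header.
func azureChatCompletionsURL(resource: String, deployment: String, apiVersion: String) -> URL {
    var components = URLComponents()
    components.scheme = "https"
    components.host = "\(resource).openai.azure.com"
    components.path = "/openai/deployments/\(deployment)/chat/completions"
    components.queryItems = [URLQueryItem(name: "api-version", value: apiVersion)]
    return components.url!
}

var request = URLRequest(url: azureChatCompletionsURL(
    resource: "myresource", deployment: "gpt-4o-deployment", apiVersion: "2024-02-01"))
// Azure expects `api-key: <key>` rather than `Authorization: Bearer <key>`.
request.setValue("<key>", forHTTPHeaderField: "api-key")
print(request.url!.absoluteString)
```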

Changes

RemoteProviderConfiguration.swift

  • Added case azureOpenAI to RemoteProviderType enum
  • Added two new optional fields on RemoteProvider: azureDeploymentName and azureAPIVersion (defaults to "2024-02-01")
  • Added azureChatCompletionsURL() helper that constructs the full deployment URL with api-version query param
  • Added azureModelsURL() helper for the /openai/models endpoint
  • Updated resolvedHeaders() to inject api-key: <key> for Azure (instead of Authorization: Bearer)
  • All new fields use decodeIfPresent for full backward compatibility with existing config files
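To illustrate the backward-compatibility point, here is a hedged sketch of how `decodeIfPresent` lets pre-Azure config files keep loading. The field layout of `RemoteProvider` is assumed from the bullet list above, not copied from the PR.

```swift
import Foundation

// Illustrative sketch: Azure-only keys decode to nil (or a default) when
// absent, so older config files deserialise without errors.
struct RemoteProvider: Decodable {
    let name: String
    let azureDeploymentName: String?
    let azureAPIVersion: String?

    enum CodingKeys: String, CodingKey {
        case name, azureDeploymentName, azureAPIVersion
    }

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        name = try c.decode(String.self, forKey: .name)
        azureDeploymentName = try c.decodeIfPresent(String.self, forKey: .azureDeploymentName)
        // A missing key falls back to the documented default instead of throwing.
        azureAPIVersion = try c.decodeIfPresent(String.self, forKey: .azureAPIVersion) ?? "2024-02-01"
    }
}

// A config file written before this PR has no Azure keys, yet still decodes.
let legacyJSON = Data(#"{"name": "openai"}"#.utf8)
let provider = try! JSONDecoder().decode(RemoteProvider.self, from: legacyJSON)
print(provider.azureAPIVersion ?? "nil")
```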

RemoteProviderService.swift

  • buildURLRequest: routes Azure providers through azureChatCompletionsURL() instead of the generic host + basePath + endpoint pattern
  • Body encoding switch: Azure reuses the OpenAI ChatCompletionRequest format (no conversion needed)
  • parseResponse switch: Azure reuses ChatCompletionResponse decoding (identical to openaiLegacy)
  • Streaming SSE: falls through to the existing OpenAI ChatCompletionChunk branch — Azure's streaming format is identical
  • fetchModels: dispatches to new fetchAzureOpenAIModels(from:) which calls /openai/models?api-version=<version>
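The dispatch pattern above can be reduced to one idea: only URL construction branches for Azure, while body encoding, response parsing, and SSE handling fall through to the OpenAI paths. A toy sketch (the enum cases besides `azureOpenAI` are illustrative placeholders, not the project's real case list):

```swift
// Hypothetical sketch: Azure shares OpenAI's wire format end to end, so the
// encode/parse/stream switches group it with the legacy OpenAI branch.
enum RemoteProviderType {
    case openaiLegacy, azureOpenAI, otherProvider
}

func usesOpenAIWireFormat(_ type: RemoteProviderType) -> Bool {
    switch type {
    case .openaiLegacy, .azureOpenAI:
        // Same ChatCompletionRequest body, ChatCompletionResponse decoding,
        // and ChatCompletionChunk SSE stream.
        return true
    case .otherProvider:
        return false
    }
}

print(usesOpenAIWireFormat(.azureOpenAI))
```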

ProviderPresets.swift

  • Added case azure preset with Azure blue brand gradient, cloud.fill SF Symbol, and help steps tailored to the Azure Portal workflow
  • matching(provider:) identifies Azure providers by providerType == .azureOpenAI (host varies per resource, so string matching isn't reliable)

Configuration example

  • Provider Type: Azure OpenAI
  • Host: myresource.openai.azure.com
  • Auth: API Key → stored in Keychain, sent as api-key header
  • Deployment Name: gpt-4o-deployment
  • API Version: 2024-02-01 (default) or any supported version

The resulting chat completions URL:

https://myresource.openai.azure.com/openai/deployments/gpt-4o-deployment/chat/completions?api-version=2024-02-01

Design decisions

  • No new API models — Azure's request/response format is identical to OpenAI's ChatCompletionRequest / ChatCompletionResponse, so we reuse them entirely (DRY).
  • Backward compatible — new fields (azureDeploymentName, azureAPIVersion) use decodeIfPresent so existing config files deserialise without errors.
  • Streaming — Azure's SSE stream is byte-for-byte identical to OpenAI's ChatCompletionChunk format; no special handling needed.
  • Tool calls — fully supported, same as openaiLegacy.
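The streaming decision can be demonstrated with a tiny decoder: because Azure emits the same SSE `data:` lines as OpenAI, one chunk type handles both. The `ChatCompletionChunk` shape below is a simplified assumption for illustration, not the project's full model.

```swift
import Foundation

// Illustrative sketch: decode one SSE `data:` line into an OpenAI-style chunk.
// The same line shape arrives from api.openai.com and *.openai.azure.com.
struct ChatCompletionChunk: Decodable {
    struct Choice: Decodable {
        struct Delta: Decodable { let content: String? }
        let delta: Delta
    }
    let choices: [Choice]
}

func deltaText(fromSSELine line: String) -> String? {
    guard line.hasPrefix("data: "), line != "data: [DONE]" else { return nil }
    let payload = Data(line.dropFirst("data: ".count).utf8)
    let chunk = try? JSONDecoder().decode(ChatCompletionChunk.self, from: payload)
    return chunk?.choices.first?.delta.content
}

print(deltaText(fromSSELine: #"data: {"choices":[{"delta":{"content":"Hello"}}]}"#) ?? "nil")
```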

@ritave force-pushed the feat/azure-foundry-provider branch from 1889447 to 96faffa on April 14, 2026 at 12:52
@ritave
Contributor

ritave commented Apr 14, 2026

Hello @VDurocher, thank you for the contribution.

Unfortunately it fails CI; please fix your code and push the changes.

Keep in mind I've rebased your changes on top of the main branch, so you'll need to fetch the updated branch locally.

@mimeding
Contributor

Local status update: I prepared fixes for the failing test-core path (the broken model-id map in RemoteProviderService and the missing .azureOpenAI handling in RemoteProviderManager) and verified the exact OsaurusCoreTests CI-equivalent run locally.

I attempted to push the update back to VDurocher/osaurus:feat/azure-foundry-provider, but GitHub rejected the maintainer push from my account with 403 Permission denied. The remaining blocker on this PR is branch write access, not local test coverage.

@mimeding
Contributor

Published a maintainer-owned replacement for this branch as #865 because the validated fix could not be pushed back to the contributor branch. The replacement is rebased onto current main, preserves the Azure provider work, and has fresh local OsaurusCoreTests, OsaurusCLITests, swiftlint, and shellcheck validation.

@tpae
Contributor

tpae commented Apr 17, 2026

@copilot resolve the merge conflicts in this pull request

@mimeding
Contributor

The current CI root cause is a compile typo in RemoteProviderService.swift:

```swift
return modelsResponse.data.map { /usr/bin/bash.id }
```

This PR is also conflicting/dirty. I am treating #957 as the preferred Azure Foundry lane unless this branch has unique behavior that #957 does not cover.
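For context, the quoted line fails because `/usr/bin/bash.id` is not valid Swift; the intent is presumably to map each model entry to its id via the shorthand closure argument. This is a hedged guess at the fix, with `Model` and `ModelsResponse` shapes assumed for the example:

```swift
// Hypothetical minimal types standing in for the project's models response.
struct Model { let id: String }
struct ModelsResponse { let data: [Model] }

// Presumed intent of the broken line: collect each model's id.
func modelIDs(from modelsResponse: ModelsResponse) -> [String] {
    return modelsResponse.data.map { $0.id }
}

print(modelIDs(from: ModelsResponse(data: [Model(id: "gpt-4o")])))
```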

@mimeding
Contributor

Clean-PR rule note: this PR is still marked ready, but it has a failing test-core check and merge conflicts. I do not have permission to convert external PRs to draft from this account. Treat this as blocked/not mergeable until the branch is rebased and `scripts/ci/check-pr-clean.sh osaurus-ai/osaurus 835` passes after #975 lands.



Development

Successfully merging this pull request may close these issues.

Support for Azure Foundry (openai compatible)

4 participants