Summary
Add support for an azure_anthropic backend that enables using Anthropic models (Claude) hosted on Azure AI Foundry.
Motivation
Azure AI Foundry / Azure AI Model Catalog now offers Anthropic models as a managed service. Organizations using Azure often need to route API calls through their Azure endpoints for compliance, billing, and network policy reasons. The existing anthropic backend only targets api.anthropic.com, and the existing azure_openai backend only works with OpenAI models on Azure.
Proposed Implementation
The Anthropic Python SDK already supports a `base_url` parameter, so the implementation is straightforward: a thin wrapper around the existing `AnthropicClient` that reads Azure Foundry config.
The backend follows the `ANTHROPIC_FOUNDRY_*` env-var convention used by Claude Code:
| Variable | Purpose |
|---|---|
| `ANTHROPIC_FOUNDRY_API_KEY` | API key |
| `ANTHROPIC_FOUNDRY_RESOURCE` | Azure resource name (derives base URL) |
| `ANTHROPIC_FOUNDRY_BASE_URL` | Explicit base URL (overrides resource) |
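The resolution precedence above (explicit base URL wins over a URL derived from the resource name) could be sketched roughly as follows. The function name `resolve_foundry_config` and the derived-URL shape are assumptions for illustration, not the actual implementation:

```python
import os


def resolve_foundry_config(env=None):
    """Resolve (api_key, base_url) from ANTHROPIC_FOUNDRY_* variables.

    An explicit ANTHROPIC_FOUNDRY_BASE_URL overrides the URL derived
    from ANTHROPIC_FOUNDRY_RESOURCE.
    """
    env = os.environ if env is None else env
    api_key = env["ANTHROPIC_FOUNDRY_API_KEY"]
    base_url = env.get("ANTHROPIC_FOUNDRY_BASE_URL")
    if not base_url:
        resource = env["ANTHROPIC_FOUNDRY_RESOURCE"]
        # Assumed endpoint shape for an Azure AI Foundry resource;
        # the real scheme may differ.
        base_url = f"https://{resource}.services.ai.azure.com"
    return api_key, base_url


# The resolved values would then feed the existing client, e.g.:
# client = anthropic.Anthropic(api_key=api_key, base_url=base_url)
```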
Usage:

```shell
export ANTHROPIC_FOUNDRY_API_KEY="your-key"
export ANTHROPIC_FOUNDRY_RESOURCE="ml-platform-openai-stg-useast-2"
rlm ask . -q "Summarize this repo" --backend azure_anthropic --model claude-opus-4-6
```

Or via config:
```yaml
backend: azure_anthropic
model: claude-opus-4-6
backend_kwargs:
  resource: ml-platform-openai-stg-useast-2
```