Add azure_anthropic backend for Anthropic models on Azure AI Foundry #100

@timothyjlaurent

Summary

Add support for an azure_anthropic backend that enables using Anthropic models (Claude) hosted on Azure AI Foundry.

Motivation

Azure AI Foundry / Azure AI Model Catalog now offers Anthropic models as a managed service. Organizations using Azure often need to route API calls through their Azure endpoints for compliance, billing, and network policy reasons. The existing anthropic backend only targets api.anthropic.com, and the existing azure_openai backend only works with OpenAI models on Azure.

Proposed Implementation

The Anthropic Python SDK already supports a base_url parameter, so the implementation is straightforward: a thin wrapper around the existing AnthropicClient that reads Azure Foundry configuration.

Follows the ANTHROPIC_FOUNDRY_* env-var convention used by Claude Code:

| Variable | Purpose |
| --- | --- |
| `ANTHROPIC_FOUNDRY_API_KEY` | API key |
| `ANTHROPIC_FOUNDRY_RESOURCE` | Azure resource name (derives the base URL) |
| `ANTHROPIC_FOUNDRY_BASE_URL` | Explicit base URL (overrides the resource) |
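A minimal sketch of the config-resolution logic this table implies. The function names and the derived endpoint shape are assumptions for illustration, not the final implementation:

```python
import os
from typing import Optional

def resolve_foundry_base_url(
    resource: Optional[str], base_url: Optional[str]
) -> str:
    """An explicit base URL overrides the resource-derived one."""
    if base_url:
        return base_url
    if resource:
        # Assumed Azure AI Foundry endpoint shape; verify against your deployment.
        return f"https://{resource}.services.ai.azure.com/anthropic"
    raise ValueError(
        "Set ANTHROPIC_FOUNDRY_BASE_URL or ANTHROPIC_FOUNDRY_RESOURCE"
    )

def foundry_base_url_from_env() -> str:
    """Read the ANTHROPIC_FOUNDRY_* variables and resolve the base URL."""
    return resolve_foundry_base_url(
        os.environ.get("ANTHROPIC_FOUNDRY_RESOURCE"),
        os.environ.get("ANTHROPIC_FOUNDRY_BASE_URL"),
    )
```

The wrapper would then pass the resolved URL as base_url (together with ANTHROPIC_FOUNDRY_API_KEY) to the existing AnthropicClient.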

Usage:

```shell
export ANTHROPIC_FOUNDRY_API_KEY="your-key"
export ANTHROPIC_FOUNDRY_RESOURCE="ml-platform-openai-stg-useast-2"

rlm ask . -q "Summarize this repo" --backend azure_anthropic --model claude-opus-4-6
```

Or via config:

```yaml
backend: azure_anthropic
model: claude-opus-4-6
backend_kwargs:
  resource: ml-platform-openai-stg-useast-2
```
