
@lucidprogrammer

Please describe the purpose of this pull request.
This PR adds support for Azure Anthropic endpoints by integrating the AnthropicFoundry client into the existing Anthropic provider logic.

Currently, letta assumes all Anthropic interaction occurs via the standard API. This change allows users to configure a custom endpoint (e.g., an Azure AI Foundry endpoint) via ANTHROPIC_API_BASE. When this environment variable is detected, the system:

  1. Dynamically switches to using AnthropicFoundry / AsyncAnthropicFoundry.
  2. Ensures API keys are passed explicitly (resolving authentication differences between standard and Foundry clients).
  3. Implements a fallback for list_llm_models_async as Azure/Foundry endpoints often do not support dynamic model listing (GET /models).
  4. Adds the Azure-specific model alias anthropic/claude-opus-4-5 to the supported model list.
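
For illustration, here is a minimal, hypothetical sketch of the switching and fallback behavior described in points 1–3. It is not the exact letta implementation: the client names AnthropicFoundry / AsyncAnthropicFoundry come from this PR, while the import path, the constructor arguments (api_key, base_url), the helper names, and the fallback model list shown here are assumptions.

import os

from anthropic import AsyncAnthropic

def build_async_anthropic_client(api_key: str):
    base_url = os.environ.get("ANTHROPIC_API_BASE")
    if base_url:
        # A custom (e.g. Azure AI Foundry) endpoint is configured:
        # switch to the Foundry client and pass the key explicitly,
        # since the Foundry client does not resolve credentials the
        # same way the standard client does.
        from anthropic import AsyncAnthropicFoundry  # assumed import path
        return AsyncAnthropicFoundry(api_key=api_key, base_url=base_url)
    # Default path: the standard Anthropic API client.
    return AsyncAnthropic(api_key=api_key)

# Hypothetical static fallback list for endpoints without GET /models.
FOUNDRY_FALLBACK_MODELS = ["claude-opus-4-5"]

async def list_llm_models_async(client) -> list[str]:
    if os.environ.get("ANTHROPIC_API_BASE"):
        # Azure/Foundry endpoints often do not support dynamic model
        # listing, so fall back to a static list.
        return FOUNDRY_FALLBACK_MODELS
    page = await client.models.list()
    return [m.id for m in page.data]

In the PR itself this logic is wired into the existing Anthropic provider rather than into standalone helpers like these.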

How to test
Run the newly added test case which mocks the Foundry client initialization:

pytest tests/test_llm_clients.py -k test_anthropic_foundry_client_initialization

Manual Verification

Set ANTHROPIC_API_BASE in your .env to a valid Azure Anthropic endpoint. Set ANTHROPIC_API_KEY to your Azure key. Create a generic agent using an Azure-hosted model (e.g., anthropic/claude-opus-4-5).
Send a message and verify the response.

Following is the code I used to test the feature end to end:

import os
import sys

try:
    from letta_client import Letta
    from letta_client.types import CreateBlockParam, MessageCreateParam
except ImportError as e:
    import traceback
    traceback.print_exc()
    print(f"Error importing letta_client: {e}")
    sys.exit(1)

def verify_azure_anthropic():
    # Connect to local server
    client = Letta(base_url="http://localhost:8283")
    
    print("Connected to Letta server at http://localhost:8283")

    # model handle: provider/model-name
    model = "anthropic/claude-opus-4-5"
    print(f"Creating agent with model: {model}")

    try:
        agent = client.agents.create(
            name="azure-test-agent",
            model=model,
            embedding="ollama/nomic-embed-text:latest", 
            memory_blocks=[
                CreateBlockParam(label="persona", value="You are a helpful AI assistant running on Azure Anthropic."),
                CreateBlockParam(label="human", value="User"),
            ]
        )
        print(f"Agent created successfully! ID: {agent.id}")
    except Exception as e:
        print(f"Failed to create agent: {e}")
        return

    # Send a message
    print("Sending test message...")
    try:
        response = client.agents.messages.create(
            agent_id=agent.id,
            messages=[MessageCreateParam(role="user", content="Hello! Are you working via Azure?")]
        )
        
        print("Response received:")
        # The response shape can vary by SDK version; handle both a typed
        # .messages attribute and a raw response object.
        if hasattr(response, 'messages'):
            for msg in response.messages:
                if hasattr(msg, 'content'):
                    print(f"Agent: {msg.content}")
                else:
                    print(f"Message: {msg}")
        else:
            print(response)

        print("\nVerification Successful!")
        
        # Cleanup
        print("Cleaning up agent...")
        client.agents.delete(agent_id=agent.id)

    except Exception as e:
        print(f"Error during chat: {e}")

if __name__ == "__main__":
    verify_azure_anthropic()
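
To reproduce this check, start a local Letta server (the script targets the default http://localhost:8283) with ANTHROPIC_API_BASE and ANTHROPIC_API_KEY set as described above, then run the script with Python (e.g. python verify_azure_anthropic.py, or whatever filename you save it under).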

Have you tested this PR?

Unit Tests: Added test_anthropic_foundry_client_initialization to tests/test_llm_clients.py and verified it passes.
Manual Verification: Verified successful agent creation and message generation against a live Azure Anthropic endpoint using a custom verification script.


Related issues or PRs
None

Is your PR over 500 lines of code?
No

Additional context
None

@lucidprogrammer force-pushed the feature/azure-anthropic-support branch from fbf825c to f4d1da9 on December 18, 2025 at 09:15.
