@@ -15,7 +15,7 @@ agents with specialized capabilities and tools. It features:
 - **📦 Agent distribution** via Docker registry integration
 - **🔒 Security-first design** with proper client scoping and resource isolation
 - **⚡ Event-driven streaming** for real-time interactions
-- **🧠 Multi-model support** (OpenAI, Anthropic, Gemini, [Docker Model Runner (DMR)](https://docs.docker.com/ai/model-runner/))
+- **🧠 Multi-model support** (OpenAI, Anthropic, Gemini, [AWS Bedrock](https://aws.amazon.com/bedrock/), [Docker Model Runner (DMR)](https://docs.docker.com/ai/model-runner/))


 ## Why?
@@ -223,7 +223,7 @@ cagent run ./agent.yaml /analyze

 | Property      | Type    | Description                                                          | Required |
 |---------------|---------|----------------------------------------------------------------------|----------|
-| `provider`    | string  | Provider: `openai`, `anthropic`, `google`, `dmr`                     | ✓        |
+| `provider`    | string  | Provider: `openai`, `anthropic`, `google`, `amazon-bedrock`, `dmr`   | ✓        |
 | `model`       | string  | Model name (e.g., `gpt-4o`, `claude-sonnet-4-0`, `gemini-2.5-flash`) | ✓        |
 | `temperature` | float   | Randomness (0.0-1.0)                                                 | ✗        |
 | `max_tokens`  | integer | Response length limit                                                | ✗        |
@@ -238,7 +238,7 @@ cagent run ./agent.yaml /analyze
 ```yaml
 models:
   model_name:
-    provider: string    # Provider: openai, anthropic, google, dmr
+    provider: string    # Provider: openai, anthropic, google, amazon-bedrock, dmr
     model: string       # Model name: gpt-4o, claude-3-7-sonnet-latest, gemini-2.5-flash, qwen3:4B, ...
     temperature: float  # Randomness (0.0-1.0)
     max_tokens: integer # Response length limit
@@ -365,13 +365,117 @@ models:
     provider: google
     model: gemini-2.5-flash

+# AWS Bedrock
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0 # Global inference profile
+
 # Docker Model Runner (DMR)
 models:
   qwen:
     provider: dmr
     model: ai/qwen3
 ```

+#### AWS Bedrock provider usage
+
+**Prerequisites:**
+- AWS account with Bedrock enabled in your region
+- Model access granted in the [Bedrock Console](https://console.aws.amazon.com/bedrock/) (some models require approval)
+- AWS credentials configured (see authentication below)
+
+**Authentication:**
+
+Bedrock supports two authentication methods:
+
+**Option 1: Bedrock API key** (simplest)
+
+Set the `AWS_BEARER_TOKEN_BEDROCK` environment variable to your Bedrock API key. You can customize the env var name using `token_key`:
+
+```yaml
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0
+    token_key: AWS_BEARER_TOKEN_BEDROCK # Name of env var containing your token (default)
+    provider_opts:
+      region: us-east-1
+```
+
+Generate API keys in the [Bedrock Console](https://console.aws.amazon.com/bedrock/) under "API keys".
+
+**Option 2: AWS credentials** (default)
+
+Uses the [AWS SDK default credential chain](https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/configuring-sdk.html#specifying-credentials):
+
+1. Environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
+2. Shared credentials file (`~/.aws/credentials`)
+3. Shared config file (`~/.aws/config` with `AWS_PROFILE`)
+4. IAM instance roles (EC2, ECS, Lambda)
+
+You can also use `provider_opts.role_arn` for cross-account role assumption.
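+
+With the default chain, credentials never appear in the YAML itself. A minimal sketch (the region value is an assumption, though it matches the documented default):
+
+```yaml
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0
+    provider_opts:
+      region: us-east-1 # credentials are resolved via the chain above, not this file
+```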
+
+**Basic usage with AWS profile:**
+
+```yaml
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0
+    max_tokens: 64000
+    provider_opts:
+      profile: my-aws-profile
+      region: us-east-1
+```
+
+**With IAM role assumption:**
+
+```yaml
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: anthropic.claude-3-sonnet-20240229-v1:0
+    provider_opts:
+      role_arn: "arn:aws:iam::123456789012:role/BedrockAccessRole"
+      external_id: "my-external-id"
+```
+
+**provider_opts for Bedrock:**
+
+| Option | Type | Description | Default |
+|--------|------|-------------|---------|
+| `region` | string | AWS region | us-east-1 |
+| `profile` | string | AWS profile name | (default chain) |
+| `role_arn` | string | IAM role ARN for assume role | (none) |
+| `role_session_name` | string | Session name for assumed role | cagent-bedrock-session |
+| `external_id` | string | External ID for role assumption | (none) |
+| `endpoint_url` | string | Custom endpoint (VPC/testing) | (none) |
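+
+Several of these options can be combined. A sketch with a placeholder account ID and an explicit session name (values are illustrative):
+
+```yaml
+models:
+  claude-bedrock:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0
+    provider_opts:
+      region: us-east-1
+      role_arn: "arn:aws:iam::123456789012:role/BedrockAccessRole"
+      role_session_name: my-cagent-session # overrides the default cagent-bedrock-session
+```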
+
+**Supported models (via Converse API):**
+
+All Bedrock models that support the Converse API work with cagent. Use inference profile IDs for best availability:
+
+- **Anthropic Claude**: `global.anthropic.claude-sonnet-4-5-20250929-v1:0`, `us.anthropic.claude-haiku-4-5-20251001-v1:0`
+- **Amazon Nova**: `global.amazon.nova-2-lite-v1:0`
+- **Meta Llama**: `us.meta.llama3-2-90b-instruct-v1:0`
+- **Mistral**: `us.mistral.mistral-large-2407-v1:0`
+
+**Inference profile prefixes:**
+
+| Prefix | Routes to |
+|--------|-----------|
+| `global.` | All commercial AWS regions (recommended) |
+| `us.` | US regions only |
+| `eu.` | EU regions only (GDPR compliance) |
+
+```yaml
+models:
+  claude-global:
+    provider: amazon-bedrock
+    model: global.anthropic.claude-sonnet-4-5-20250929-v1:0 # Routes to any available region
+```
+
 #### DMR (Docker Model Runner) provider usage

 If `base_url` is omitted, Docker `cagent` will use `http://localhost:12434/engines/llama.cpp/v1` by default.
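+
+The default can also be set explicitly. A minimal sketch (the URL shown is simply the documented default):
+
+```yaml
+models:
+  qwen:
+    provider: dmr
+    model: ai/qwen3
+    base_url: http://localhost:12434/engines/llama.cpp/v1 # optional; this is the default
+```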
@@ -427,7 +531,7 @@ Requirements and notes:
 - Docker Model plugin must be available for auto-configure/auto-discovery
   - Verify with: `docker model status --json`
 - Configuration is best-effort; failures fall back to the default base URL
-- `provider_opts` currently apply to `dmr` and `anthropic` providers
+- `provider_opts` currently apply to `dmr`, `anthropic`, and `amazon-bedrock` providers
 - `runtime_flags` are passed after `--` to the inference runtime (e.g., llama.cpp)
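+
+  For example (a sketch: it assumes `runtime_flags` sits under `provider_opts`, and the flag shown is illustrative, not a documented default):
+
+  ```yaml
+  models:
+    qwen:
+      provider: dmr
+      model: ai/qwen3
+      provider_opts:
+        runtime_flags: ["--ctx-size", "8192"] # forwarded to llama.cpp after `--`
+  ```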

 Parameter mapping and precedence (DMR):