
Commit f9224a0

Merge branch 'main' into claude/issue-1684-20250710_132451

2 parents: 4747260 + d660753

12 files changed: +408 -108 lines

.github/workflows/claude.yml

Lines changed: 1 addition & 1 deletion

````diff
@@ -61,4 +61,4 @@ jobs:
 # Optional: Custom environment variables for Claude
 # claude_env: |
 #   NODE_ENV: test
-
+allowed_tools: "Bash(git commit), Bash(uv sync), Bash(uv run)"
````

docs/integrations/bedrock.md

Lines changed: 52 additions & 28 deletions

````diff
@@ -57,26 +57,20 @@ Or configure using AWS CLI:
 aws configure
 ```
 
-### Sync Example
+## Sync Example
 
 ```python
 import boto3
 import instructor
 from pydantic import BaseModel
 
-# Initialize the Bedrock client
 bedrock_client = boto3.client('bedrock-runtime')
-
-# Enable instructor patches for Bedrock client
 client = instructor.from_bedrock(bedrock_client)
 
-
 class User(BaseModel):
     name: str
     age: int
 
-
-# Create structured output
 user = client.chat.completions.create(
     modelId="anthropic.claude-3-sonnet-20240229-v1:0",
     messages=[
````
````diff
@@ -89,31 +83,32 @@ print(user)
 # > User(name='Jason', age=25)
 ```
 
-### Async Example
+## Async Example
+
+> **Warning:**
+> AWS Bedrock's official SDK (`boto3`) does not support async natively. If you need to call Bedrock from async code, you can use `asyncio.to_thread` to run synchronous Bedrock calls in a non-blocking way.
 
 ```python
-import boto3
 import instructor
 from pydantic import BaseModel
 import asyncio
 
-# Initialize the Bedrock client
-bedrock_client = boto3.client('bedrock-runtime')
-
-# Enable instructor patches for async Bedrock client
-async_client = instructor.from_bedrock(bedrock_client, async_client=True)
+client = instructor.from_provider("bedrock/anthropic.claude-3-sonnet-20240229-v1:0")
 
 class User(BaseModel):
     name: str
     age: int
 
-async def get_user_async():
-    return await async_client.chat.completions.create(
+def get_user():
+    return client.chat.completions.create(
         modelId="anthropic.claude-3-sonnet-20240229-v1:0",
         messages=[{"role": "user", "content": "Extract Jason is 25 years old"}],
         response_model=User,
     )
 
+async def get_user_async():
+    return await asyncio.to_thread(get_user)
+
 user = asyncio.run(get_user_async())
 print(user)
 ```
````
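
A note on the async pattern introduced above: because every Bedrock call still runs synchronously under the hood, `asyncio.to_thread` is also what lets you fan several extractions out concurrently. The following is a minimal sketch of that usage, not part of the diff; it assumes the same provider string and `User` model as the example above, and the helper names `extract_user` / `extract_many` are made up for illustration.

```python
import asyncio

import instructor
from pydantic import BaseModel

# Same provider string as in the diff above; substitute your own model ID.
client = instructor.from_provider("bedrock/anthropic.claude-3-sonnet-20240229-v1:0")


class User(BaseModel):
    name: str
    age: int


def extract_user(text: str) -> User:
    # Plain synchronous call -- boto3 (and thus Bedrock) has no native async API.
    return client.chat.completions.create(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        messages=[{"role": "user", "content": text}],
        response_model=User,
    )


async def extract_many(texts: list[str]) -> list[User]:
    # asyncio.to_thread pushes each blocking call onto a worker thread,
    # so the extractions run concurrently instead of serially.
    results = await asyncio.gather(*(asyncio.to_thread(extract_user, t) for t in texts))
    return list(results)


users = asyncio.run(extract_many(["Jason is 25 years old", "Sarah is 30 years old"]))
print(users)
```

A thread per call is fine for a handful of requests; for larger fan-outs you would typically bound concurrency (for example with an `asyncio.Semaphore`) so you do not exhaust Bedrock rate limits or local threads.
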
````diff
@@ -131,31 +126,60 @@ import instructor
 from instructor import Mode
 from pydantic import BaseModel
 
-# Initialize the Bedrock client
 bedrock_client = boto3.client('bedrock-runtime')
-
-# Enable instructor patches for Bedrock client with specific mode
 client = instructor.from_bedrock(bedrock_client, mode=Mode.BEDROCK_TOOLS)
 
-
 class User(BaseModel):
     name: str
     age: int
+```
 
+## OpenAI Compatibility: Flexible Input Format and Model Parameter
 
-# Create structured output
-user = client.chat.completions.create(
-    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
-    messages=[
-        {"role": "user", "content": "Extract: Jason is 25 years old"},
-    ],
+Instructor’s Bedrock integration supports both OpenAI-style and Bedrock-native message formats, as well as any mix of the two. You can use either:
+
+- **OpenAI-style**:
+  `{"role": "user", "content": "Extract: Jason is 25 years old"}`
+
+- **Bedrock-native**:
+  `{"role": "user", "content": [{"text": "Extract: Jason is 25 years old"}]}`
+
+- **Mixed**:
+  You can freely mix OpenAI-style and Bedrock-native messages in the same request. The integration will automatically convert OpenAI-style messages to the correct Bedrock format, while preserving any Bedrock-native fields you provide.
+
+This flexibility also applies to other keyword arguments, such as the model name:
+
+- You can use either `model` (OpenAI-style) or `modelId` (Bedrock-native) as a keyword argument.
+- If you provide `model`, Instructor will automatically convert it to `modelId` for Bedrock.
+- If you provide both, `modelId` takes precedence.
+
+**Example:**
+
+```python
+import instructor
+
+messages = [
+    {"role": "system", "content": "Extract the name and age."},  # OpenAI-style
+    {"role": "user", "content": [{"text": "Extract: Jason is 25 years old"}]},  # Bedrock-native
+    {"role": "assistant", "content": "Sure! Jason is 25."},  # OpenAI-style
+]
+
+# Both of these are valid:
+user = client.create(
+    model="anthropic.claude-3-sonnet-20240229-v1:0",  # OpenAI-style
+    messages=messages,
     response_model=User,
 )
 
-print(user)
-# > User(name='Jason', age=25)
+user = client.create(
+    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # Bedrock-native
+    messages=messages,
+    response_model=User,
+)
 ```
 
+All of the above will work seamlessly with Instructor’s Bedrock integration.
+
 ## Nested Objects
 
 ```python
````
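
To make the message-format conversion described in the new section a bit more concrete, here is a rough, hypothetical sketch of the kind of normalization it implies. This is not Instructor's actual implementation (the real conversion handles more cases); it only shows how an OpenAI-style string `content` maps onto Bedrock's list-of-text-blocks shape while Bedrock-native messages pass through unchanged.

```python
# Hypothetical sketch of the OpenAI-style -> Bedrock-native conversion described
# above; Instructor's real conversion logic is more complete than this.
def normalize_message(message: dict) -> dict:
    content = message["content"]
    if isinstance(content, str):
        # OpenAI-style: a plain string becomes a list of Bedrock text blocks.
        content = [{"text": content}]
    # Bedrock-native content (already a list of blocks) is preserved as-is.
    return {"role": message["role"], "content": content}


messages = [
    {"role": "system", "content": "Extract the name and age."},                 # OpenAI-style
    {"role": "user", "content": [{"text": "Extract: Jason is 25 years old"}]},  # Bedrock-native
]
print([normalize_message(m) for m in messages])
```
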
