Commit 1a02223

deepagents: update default model (#3575)

1 parent da0595f

31 files changed: +297 -167 lines changed

src/code-samples/deepagents/content-builder.py
Lines changed: 1 addition & 1 deletion

@@ -140,7 +140,7 @@ def load_subagents(config_path: Path) -> list:
 def create_content_writer():
     """Create a content writer agent configured by filesystem files."""
     return create_deep_agent(
-        model="openai:gpt-5.4",
+        model="anthropic:claude-sonnet-4-6",
         memory=["./AGENTS.md"],
         skills=["./skills/"],
         tools=[generate_cover, generate_social_image],

src/langsmith/trace-deep-agents.mdx
Lines changed: 2 additions & 2 deletions

@@ -103,7 +103,7 @@ def yearly_balance_schedule(
 
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[compute_compound_interest, yearly_balance_schedule],
     system_prompt=(
         "You are a careful assistant. "
@@ -239,7 +239,7 @@ def yearly_balance_schedule(
 
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[compute_compound_interest, yearly_balance_schedule],
     system_prompt=(
         "You are a careful assistant. "

src/oss/deepagents/acp.mdx
Lines changed: 1 addition & 1 deletion

@@ -59,7 +59,7 @@ from deepagents_acp.server import AgentServerACP
 
 async def main() -> None:
     agent = create_deep_agent(
-        model="openai:gpt-5.4",
+        model="google_genai:gemini-3.1-pro-preview",
         # You can customize your Deep Agent here: set a custom prompt,
         # add your own tools, attach middleware, or compose subagents.
         system_prompt="You are a helpful coding assistant",

src/oss/deepagents/async-subagents.mdx
Lines changed: 4 additions & 4 deletions

@@ -67,7 +67,7 @@ async_subagents = [
 ]
 
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     subagents=async_subagents,
 )
 ```
@@ -101,7 +101,7 @@ const asyncSubagents: AsyncSubAgent[] = [
 ];
 
 const agent = createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   subagents: [...asyncSubagents],
 });
 ```
@@ -348,7 +348,7 @@ When using LangGraph-based deployments, every async subagent run is a standard L
 :::python
 ```python
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     system_prompt="""...your instructions...
 
 After launching an async subagent, ALWAYS return control to the user.
@@ -361,7 +361,7 @@ agent = create_deep_agent(
 :::js
 ```typescript
 const agent = createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   systemPrompt: `...your instructions...
 
 After launching an async subagent, ALWAYS return control to the user.

src/oss/deepagents/backends.mdx
Lines changed: 9 additions & 9 deletions

@@ -43,11 +43,11 @@ Here are a few prebuilt filesystem backends that you can quickly use with your d
 
 | Built-in backend | Description |
 |---|---|
-| [Default](#statebackend-ephemeral) | `agent = create_deep_agent(model="openai:gpt-5.4")` <br></br> Ephemeral in state. The default filesystem backend for an agent is stored in `langgraph` state. Note that this filesystem only persists _for a single thread_. |
-| [Local filesystem persistence](#filesystembackend-local-disk) | `agent = create_deep_agent(model="openai:gpt-5.4", backend=FilesystemBackend(root_dir="/Users/nh/Desktop/"))` <br></br>This gives the deep agent access to your local machine's filesystem. You can specify the root directory that the agent has access to. Note that any provided `root_dir` must be an absolute path. |
-| [Durable store (LangGraph store)](#storebackend-langgraph-store) | `agent = create_deep_agent(model="openai:gpt-5.4", backend=StoreBackend())` <br></br>This gives the agent access to long-term storage that is _persisted across threads_. This is great for storing longer term memories or instructions that are applicable to the agent over multiple executions. |
-| [Sandbox](/oss/deepagents/sandboxes) | `agent = create_deep_agent(model="openai:gpt-5.4", backend=sandbox)` <br></br>Execute code in isolated environments. Sandboxes provide filesystem tools plus the `execute` tool for running shell commands. Choose from Modal, Daytona, Deno, or local VFS. |
-| [Local shell](#localshellbackend-local-shell) | `agent = create_deep_agent(model="openai:gpt-5.4", backend=LocalShellBackend(root_dir=".", env={"PATH": "/usr/bin:/bin"}))` <br></br>Filesystem and shell execution directly on the host. No isolation—use only in controlled development environments. See [security considerations](#localshellbackend-local-shell) below. |
+| [Default](#statebackend-ephemeral) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview")` <br></br> Ephemeral in state. The default filesystem backend for an agent is stored in `langgraph` state. Note that this filesystem only persists _for a single thread_. |
+| [Local filesystem persistence](#filesystembackend-local-disk) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=FilesystemBackend(root_dir="/Users/nh/Desktop/"))` <br></br>This gives the deep agent access to your local machine's filesystem. You can specify the root directory that the agent has access to. Note that any provided `root_dir` must be an absolute path. |
+| [Durable store (LangGraph store)](#storebackend-langgraph-store) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=StoreBackend())` <br></br>This gives the agent access to long-term storage that is _persisted across threads_. This is great for storing longer term memories or instructions that are applicable to the agent over multiple executions. |
+| [Sandbox](/oss/deepagents/sandboxes) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=sandbox)` <br></br>Execute code in isolated environments. Sandboxes provide filesystem tools plus the `execute` tool for running shell commands. Choose from Modal, Daytona, Deno, or local VFS. |
+| [Local shell](#localshellbackend-local-shell) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=LocalShellBackend(root_dir=".", env={"PATH": "/usr/bin:/bin"}))` <br></br>Filesystem and shell execution directly on the host. No isolation—use only in controlled development environments. See [security considerations](#localshellbackend-local-shell) below. |
 | [Composite](#compositebackend-router) | Ephemeral by default, `/memories/` persisted. The Composite backend is maximally flexible. You can specify different routes in the filesystem to point towards different backends. See Composite routing below for a ready-to-paste example. |
 
 
@@ -369,7 +369,7 @@ from deepagents import create_deep_agent
 from deepagents.backends import CompositeBackend, StateBackend, FilesystemBackend
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=CompositeBackend(
         default=StateBackend(),
         routes={
@@ -555,7 +555,7 @@ Use [permissions](/oss/deepagents/permissions) to declaratively control which fi
 from deepagents import create_deep_agent, FilesystemPermission
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=CompositeBackend(
         default=StateBackend(),
         routes={
@@ -828,7 +828,7 @@ from deepagents import create_deep_agent
 from deepagents.backends import CompositeBackend, StateBackend, StoreBackend
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=lambda rt: CompositeBackend(
         default=StateBackend(rt),
         routes={"/memories/": StoreBackend(rt, namespace=lambda rt: (rt.server_info.user.identity,))},
@@ -837,7 +837,7 @@ agent = create_deep_agent(
 
 # After
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=CompositeBackend(
         default=StateBackend(),
         routes={"/memories/": StoreBackend(namespace=lambda rt: (rt.server_info.user.identity,))},

src/oss/deepagents/context-engineering.mdx
Lines changed: 11 additions & 11 deletions

@@ -54,7 +54,7 @@ Your custom system prompt is prepended to the built-in system prompt, which incl
 from deepagents import create_deep_agent
 
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     system_prompt=(
         "You are a research assistant specializing in scientific literature. "
         "Always cite sources. Use subagents for parallel research on different topics."
@@ -68,7 +68,7 @@ agent = create_deep_agent(
 import { createDeepAgent } from "deepagents";
 
 const agent = await createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   systemPrompt: `You are a research assistant specializing in scientific literature.
 Always cite sources. Use subagents for parallel research on different topics.`,
 });
@@ -101,7 +101,7 @@ Memory files ([`AGENTS.md`](https://agents.md/)) provide persistent context that
 :::python
 ```python
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     memory=["/project/AGENTS.md", "~/.deepagents/preferences.md"],
 )
 ```
@@ -110,7 +110,7 @@ agent = create_deep_agent(
 :::js
 ```typescript
 const agent = await createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   memory: ["/project/AGENTS.md", "~/.deepagents/preferences.md"],
 });
 ```
@@ -125,7 +125,7 @@ Skills provide **on-demand** capabilities. The agent reads frontmatter from each
 :::python
 ```python
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     skills=["/skills/research/", "/skills/web-search/"],
 )
 ```
@@ -134,7 +134,7 @@ agent = create_deep_agent(
 :::js
 ```typescript
 const agent = await createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   skills: ["/skills/research/", "/skills/web-search/"],
 });
 ```
@@ -243,7 +243,7 @@ def fetch_user_data(query: str, runtime: ToolRuntime[Context]) -> str:
     return f"Data for user {user_id}: {query}"
 
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[fetch_user_data],
     context_schema=Context,
 )
@@ -284,7 +284,7 @@ const fetchUserData = tool(
 );
 
 const agent = await createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   tools: [fetchUserData],
   contextSchema,
 });
@@ -365,7 +365,7 @@ from deepagents.middleware.summarization import (
 
 backend = StateBackend # if using default backend
 
-model = "openai:gpt-5.4"
+model = "google_genai:gemini-3.1-pro-preview"
 agent = create_deep_agent(
     model=model,
     middleware=[ # [!code highlight]
@@ -449,7 +449,7 @@ def make_backend(runtime):
     )
 
 agent = create_deep_agent(
-    model="claude-sonnet-4-6",
+    model="google_genai:gemini-3.1-pro-preview",
     store=InMemoryStore(),
     backend=make_backend,
     system_prompt="""When users tell you their preferences, save them to
@@ -464,7 +464,7 @@ import { createDeepAgent, CompositeBackend, StateBackend, StoreBackend } from "d
 import { InMemoryStore } from "@langchain/langgraph-checkpoint";
 
 const agent = await createDeepAgent({
-  model: "claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   store: new InMemoryStore(),
   backend: new CompositeBackend(
     new StateBackend(),
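
The skills hunks above note that the agent reads frontmatter from each skill's `SKILL.md` to decide when to load it. A minimal sketch of that kind of frontmatter read, assuming simple `key: value` pairs between `---` fences (real skill files may use richer YAML, and this is not the deepagents parser):

```python
def read_frontmatter(text: str) -> dict[str, str]:
    """Extract flat key: value frontmatter delimited by '---' fences.
    Illustrative sketch only; assumes one pair per line."""
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

skill = """---
name: web-search
description: Search the web and summarize results
---
# Instructions
"""
print(read_frontmatter(skill)["name"])  # web-search
```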

src/oss/deepagents/customization.mdx
Lines changed: 10 additions & 9 deletions

@@ -119,7 +119,7 @@ from deepagents import create_deep_agent
 
 agent = create_deep_agent(
     model=init_chat_model(
-        model="claude-sonnet-4-6",
+        model="google_genai:gemini-3.1-pro-preview",
         max_retries=10, # Increase for unreliable networks (default: 6)
         timeout=120, # Increase timeout for slow connections
     ),
@@ -177,7 +177,7 @@ def internet_search(
     )
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[internet_search]
 )
 ```
@@ -248,7 +248,7 @@ thorough research, and then write a polished report. \
 """
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     system_prompt=research_instructions,
 )
 ```
@@ -343,7 +343,7 @@ def log_tool_calls(request, handler):
 
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[get_weather],
     middleware=[log_tool_calls],
 )
@@ -394,7 +394,7 @@ const logToolCallsMiddleware = createMiddleware({
 });
 
 const agent = await createDeepAgent({
-  model: "anthropic:claude-sonnet-4-6",
+  model: "google_genai:gemini-3.1-pro-preview",
   tools: [getWeather] as any,
   middleware: [logToolCallsMiddleware] as any,
 });
@@ -644,7 +644,7 @@ You can pass one or more file paths to the `memory` parameter when creating your
 checkpointer = MemorySaver()
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     memory=[
         "/AGENTS.md"
     ],
@@ -688,7 +688,7 @@ You can pass one or more file paths to the `memory` parameter when creating your
 )
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=StoreBackend(),
     store=store,
     memory=[
@@ -720,7 +720,7 @@ You can pass one or more file paths to the `memory` parameter when creating your
 checkpointer = MemorySaver()
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     backend=FilesystemBackend(root_dir="/Users/user/{project}"),
     memory=[
         "./AGENTS.md"
@@ -912,7 +912,7 @@ class WeatherReport(BaseModel):
 
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     response_format=WeatherReport,
     tools=[internet_search]
 )
@@ -1013,3 +1013,4 @@ console.log(result.structuredResponse);
 :::
 
 For more information and examples, see [response format](/oss/langchain/structured-output#response-format).
+n and examples, see [response format](/oss/langchain/structured-output#response-format).

src/oss/deepagents/data-analysis.mdx
Lines changed: 1 addition & 1 deletion

@@ -264,7 +264,7 @@ from deepagents import create_deep_agent
 checkpointer = InMemorySaver()
 
 agent = create_deep_agent(
-    model="anthropic:claude-sonnet-4-5",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[slack_send_message],
     backend=backend,
     checkpointer=checkpointer,

src/oss/deepagents/deploy.mdx
Lines changed: 5 additions & 5 deletions

@@ -83,7 +83,7 @@ This creates the following files:
 | --- | --- |
 | `deepagents.toml` | Agent config — name, model, optional sandbox |
 | `AGENTS.md` | System prompt loaded at session start |
-| `.env` | API key template (`ANTHROPIC_API_KEY`, `LANGSMITH_API_KEY`, etc.) |
+| `.env` | API key template (`GOOGLE_API_KEY`, `LANGSMITH_API_KEY`, etc.) |
 | `mcp.json` | MCP server configuration (empty by default) |
 | `skills/` | Directory for [Agent Skills](https://agentskills.io/), with an example `review` skill |
 
@@ -140,7 +140,7 @@ Core agent identity. For more on model selection and provider configuration, see
 ```toml deepagents.toml
 [agent]
 name = "research-assistant"
-model = "anthropic:claude-sonnet-4-6"
+model = "google_genai:gemini-3.1-pro-preview"
 ```
 
 <Note>
@@ -151,7 +151,7 @@ Skills, MCP servers, and model dependencies are auto-detected from the project l
 
 - **Skills**: the bundler recursively scans `skills/`, skipping hidden dotfiles, and bundles the rest.
 - **MCP servers**: if `mcp.json` exists, it is included in the deployment and [`langchain-mcp-adapters`](https://pypi.org/project/langchain-mcp-adapters/) is added as a dependency. Only HTTP/SSE transports are supported (stdio is rejected at bundle time).
-- **Model dependencies**: the `provider:` prefix in the `model` field determines the required `langchain-*` package (e.g., `anthropic` -> `langchain-anthropic`).
+- **Model dependencies**: the `provider:` prefix in the `model` field determines the required `langchain-*` package (e.g., `google_genai` -> `langchain-google-genai`).
 - **Sandbox dependencies**: the `[sandbox].provider` value maps to its partner package (e.g., `daytona` -> `langchain-daytona`).
 
 ### `[sandbox]`
@@ -222,15 +222,15 @@ A content writing agent that only needs a model and system prompt, with no code
 ```toml deepagents.toml
 [agent]
 name = "deepagents-deploy-content-writer"
-model = "anthropic:claude-sonnet-4-6"
+model = "google_genai:gemini-3.1-pro-preview"
 ```
 
 A coding agent with a LangSmith sandbox for running code:
 
 ```toml deepagents.toml
 [agent]
 name = "deepagents-deploy-coding-agent"
-model = "anthropic:claude-sonnet-4-6"
+model = "google_genai:gemini-3.1-pro-preview"
 
 [sandbox]
 provider = "langsmith"

src/oss/deepagents/frontend/overview.mdx
Lines changed: 1 addition & 1 deletion

@@ -45,7 +45,7 @@ graph LR
 from deepagents import create_deep_agent
 
 agent = create_deep_agent(
-    model="openai:gpt-5.4",
+    model="google_genai:gemini-3.1-pro-preview",
     tools=[get_weather],
     system_prompt="You are a helpful assistant",
     subagents=[

0 commit comments