src/oss/deepagents/backends.mdx

Here are a few prebuilt filesystem backends that you can quickly use with your deep agent:
| Built-in backend | Description |
|---|---|
| [Default](#statebackend-ephemeral) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview")` <br></br>Ephemeral in state. The default filesystem backend for an agent is stored in `langgraph` state. Note that this filesystem only persists _for a single thread_. |
| [Local filesystem persistence](#filesystembackend-local-disk) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=FilesystemBackend(root_dir="/Users/nh/Desktop/"))` <br></br>This gives the deep agent access to your local machine's filesystem. You can specify the root directory that the agent has access to. Note that any provided `root_dir` must be an absolute path. |
| [Durable store (LangGraph store)](#storebackend-langgraph-store) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=StoreBackend())` <br></br>This gives the agent access to long-term storage that is _persisted across threads_. This is great for storing longer-term memories or instructions that apply to the agent over multiple executions. |
| [Sandbox](/oss/deepagents/sandboxes) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=sandbox)` <br></br>Execute code in isolated environments. Sandboxes provide filesystem tools plus the `execute` tool for running shell commands. Choose from Modal, Daytona, Deno, or local VFS. |
| [Local shell](#localshellbackend-local-shell) | `agent = create_deep_agent(model="google_genai:gemini-3.1-pro-preview", backend=LocalShellBackend(root_dir=".", env={"PATH": "/usr/bin:/bin"}))` <br></br>Filesystem and shell execution directly on the host. No isolation—use only in controlled development environments. See [security considerations](#localshellbackend-local-shell) below. |
| [Composite](#compositebackend-router) | Ephemeral by default, `/memories/` persisted. The Composite backend is maximally flexible: you can specify different routes in the filesystem that point to different backends. See Composite routing below for a ready-to-paste example. |
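To make the Composite row concrete: the idea is that each file path is dispatched to whichever backend owns the longest matching path prefix. The sketch below is plain Python with stand-in classes, not the deepagents API — `DictBackend`, `PrefixRouter`, and their methods are illustrative only.

```python
class DictBackend:
    """Toy backend that stores file contents in a dict (stand-in, not deepagents)."""
    def __init__(self):
        self.files = {}

    def write(self, path, content):
        self.files[path] = content

    def read(self, path):
        return self.files[path]


class PrefixRouter:
    """Route each path to the backend with the longest matching prefix,
    falling back to a default backend — the spirit of CompositeBackend."""
    def __init__(self, default, routes):
        self.default = default
        self.routes = routes  # e.g. {"/memories/": durable_backend}

    def _backend_for(self, path):
        best = None
        for prefix, backend in self.routes.items():
            if path.startswith(prefix) and (best is None or len(prefix) > len(best[0])):
                best = (prefix, backend)
        return best[1] if best else self.default

    def write(self, path, content):
        self._backend_for(path).write(path, content)

    def read(self, path):
        return self._backend_for(path).read(path)


ephemeral, durable = DictBackend(), DictBackend()
fs = PrefixRouter(default=ephemeral, routes={"/memories/": durable})
fs.write("/scratch/plan.md", "draft")       # unmatched prefix -> default (ephemeral)
fs.write("/memories/style.md", "be terse")  # "/memories/" prefix -> durable
```

This is why the table describes Composite as "ephemeral by default, `/memories/` persisted": only paths under the routed prefix reach the durable backend.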
```python
from deepagents import create_deep_agent
from deepagents.backends import CompositeBackend, StateBackend, FilesystemBackend

agent = create_deep_agent(
    model="google_genai:gemini-3.1-pro-preview",
    backend=CompositeBackend(
        default=StateBackend(),
        routes={
            # routes truncated in the source diff
        },
    ),
)
```
Use [permissions](/oss/deepagents/permissions) to declaratively control which filesystem operations the agent can perform:

```python
from deepagents import create_deep_agent, FilesystemPermission
from deepagents.backends import CompositeBackend, StateBackend

agent = create_deep_agent(
    model="google_genai:gemini-3.1-pro-preview",
    backend=CompositeBackend(
        default=StateBackend(),
        routes={
            # routes truncated in the source diff
        },
    ),
)
```
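The snippet above is truncated before the permission rules appear, and the `FilesystemPermission` API itself is not shown in this diff. As a rough illustration of declarative permission checking in general — first matching rule wins, default deny — here is a self-contained sketch with hypothetical rule tuples (not the deepagents API):

```python
from fnmatch import fnmatch

def allowed(rules, operation, path):
    """Evaluate (effect, operation, glob) rules in order; first match wins.

    Hypothetical rule shape for illustration — not FilesystemPermission.
    """
    for effect, op, pattern in rules:
        if op in (operation, "*") and fnmatch(path, pattern):
            return effect == "allow"
    return False  # no rule matched: default deny

rules = [
    ("deny", "write", "/memories/*"),  # protect long-term memories from edits
    ("allow", "*", "/*"),              # everything else is permitted
]
```

Ordering matters in this scheme: the narrow deny must precede the broad allow, or writes to `/memories/` would slip through.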
```python
from deepagents import create_deep_agent
from deepagents.backends import CompositeBackend, StateBackend, StoreBackend
# remainder of this example truncated in the source diff
```

| File | Description |
|---|---|
| `AGENTS.md` | System prompt loaded at session start |
| `.env` | API key template (`GOOGLE_API_KEY`, `LANGSMITH_API_KEY`, etc.) |
| `mcp.json` | MCP server configuration (empty by default) |
| `skills/` | Directory for [Agent Skills](https://agentskills.io/), with an example `review` skill |
Core agent identity. For more on model selection and provider configuration, see the models documentation.

```toml deepagents.toml
[agent]
name = "research-assistant"
model = "google_genai:gemini-3.1-pro-preview"
```
Skills, MCP servers, and model dependencies are auto-detected from the project layout:

- **Skills**: the bundler recursively scans `skills/`, skipping hidden dotfiles, and bundles the rest.
- **MCP servers**: if `mcp.json` exists, it is included in the deployment and [`langchain-mcp-adapters`](https://pypi.org/project/langchain-mcp-adapters/) is added as a dependency. Only HTTP/SSE transports are supported (stdio is rejected at bundle time).
- **Model dependencies**: the `provider:` prefix in the `model` field determines the required `langchain-*` package (e.g., `google_genai` -> `langchain-google-genai`).
- **Sandbox dependencies**: the `[sandbox].provider` value maps to its partner package (e.g., `daytona` -> `langchain-daytona`).
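The model-dependency rule above can be sketched as a small helper. This assumes the general pattern is `langchain-` plus the provider prefix with underscores turned into hyphens, which fits the examples in the text (`google_genai` -> `langchain-google-genai`); other providers may deviate, and `model_dependency` is a hypothetical name, not part of the bundler's API.

```python
def model_dependency(model: str) -> str:
    """Derive the langchain-* package from a 'provider:model' string.

    Assumption: package name is 'langchain-' + provider with underscores
    mapped to hyphens, matching the examples given in the text.
    """
    provider = model.split(":", 1)[0]  # text before the first ':' is the provider
    return "langchain-" + provider.replace("_", "-")
```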
### `[sandbox]`
A content writing agent that only needs a model and system prompt, with no code execution:

```toml deepagents.toml
[agent]
name = "deepagents-deploy-content-writer"
model = "google_genai:gemini-3.1-pro-preview"
```

A coding agent with a LangSmith sandbox for running code:
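The diff is cut off at this point, so the coding-agent config is not shown. A minimal sketch in the same shape as the content-writer example might look like the following — the agent `name` and the `provider` value are assumptions (the text only documents that `[sandbox].provider` selects the partner package, using `daytona` as its example), not values from the source:

```toml deepagents.toml
[agent]
name = "coding-agent"  # hypothetical name
model = "google_genai:gemini-3.1-pro-preview"

[sandbox]
# Assumed provider value for a LangSmith-managed sandbox; consult the
# [sandbox] section docs for the supported provider strings.
provider = "langsmith"
```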