Merged
10 changes: 5 additions & 5 deletions src/langsmith/evaluate-graph.mdx
@@ -187,7 +187,7 @@ async def main():
max_concurrency=4, # optional
experiment_prefix="claude-sonnet-4-6-baseline", # optional
metadata={ # optional, used to populate model/prompt/tool columns in UI
"models": "anthropic:claude-sonnet-4-6",
"models": "google_genai:gemini-3.1-pro-preview",
"tools": [{"name": "search", "description": "Call to surf the web."}],
},
)
@@ -215,7 +215,7 @@ async def main():
max_concurrency=4, # optional
experiment_prefix="claude-sonnet-4-6-baseline", # optional
metadata={ # optional, used to populate model/prompt/tool columns in UI
"models": "anthropic:claude-sonnet-4-6",
"models": "google_genai:gemini-3.1-pro-preview",
"tools": [{"name": "search", "description": "Call to surf the web."}],
},
)
@@ -246,7 +246,7 @@ async def main():
max_concurrency=4, # optional
experiment_prefix="claude-sonnet-4-6-baseline", # optional
metadata={ # optional, used to populate model/prompt/tool columns in UI
"models": "anthropic:claude-sonnet-4-6",
"models": "google_genai:gemini-3.1-pro-preview",
"tools": [{"name": "search", "description": "Call to surf the web."}],
},
)
@@ -268,7 +268,7 @@ async def main():
max_concurrency=4, # optional
experiment_prefix="claude-sonnet-4-6-model-node", # optional
metadata={ # optional, used to populate model/prompt/tool columns in UI
"models": "anthropic:claude-sonnet-4-6",
"models": "google_genai:gemini-3.1-pro-preview",
"tools": [{"name": "search", "description": "Call to surf the web."}],
},
)
@@ -432,7 +432,7 @@ async def main():
max_concurrency=4, # optional
experiment_prefix="claude-sonnet-4-6-baseline", # optional
metadata={ # optional, used to populate model/prompt/tool columns in UI
"models": "anthropic:claude-sonnet-4-6",
"models": "google_genai:gemini-3.1-pro-preview",
"tools": [{"name": "search", "description": "Call to surf the web."}],
},
)
6 changes: 3 additions & 3 deletions src/oss/concepts/providers-and-models.mdx
@@ -150,17 +150,17 @@ To find available model names for a provider, refer to the provider's own documentation

## Use new models immediately

Because LangChain provider packages pass model names directly to the provider's API, you can use new models the moment a provider releases them—no LangChain update required. Simply pass the new model name:

:::python
```python
model = init_chat_model("anthropic:claude-mythos")
# reviewer note: this one is intentionally a model that doesn't exist
model = init_chat_model("google_genai:gemini-mythos")
```
:::

:::js
```typescript
const model = await initChatModel("anthropic:claude-mythos");
const model = await initChatModel("google_genai:gemini-mythos");
```
:::

4 changes: 2 additions & 2 deletions src/oss/deepagents/cli/configuration.mdx
@@ -70,7 +70,7 @@ Then override per-project where needed by placing a `.env` in the project directory
```toml
[models]
default = "ollama:qwen3:4b" # your intentional long-term preference
recent = "anthropic:claude-sonnet-4-5" # last /model switch (written automatically)
recent = "google_genai:gemini-3.1-pro-preview" # last /model switch (written automatically)
```

`[models].default` always takes priority over `[models].recent`. The `/model` command only writes to `[models].recent`, so your configured default is never overwritten by mid-session switches. To remove the default, use `/model --default --clear` or delete the `default` key from the config file.
@@ -223,7 +223,7 @@ To override model profile fields at runtime without editing the config file, pass
deepagents --profile-override '{"max_input_tokens": 4096}'

# Combine with --model
deepagents --model anthropic:claude-sonnet-4-5 --profile-override '{"max_input_tokens": 4096}'
deepagents --model google_genai:gemini-3.1-pro-preview --profile-override '{"max_input_tokens": 4096}'

# In non-interactive mode
deepagents -n "Summarize this repo" --profile-override '{"max_input_tokens": 4096}'
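Conceptually, `--profile-override` takes a JSON object whose keys replace the matching profile fields, leaving everything else untouched. A rough sketch of that merge (assumed behavior; the field names are illustrative):

```python
import json

def apply_profile_override(profile: dict, override_json: str) -> dict:
    """Merge a --profile-override JSON string into a model profile.

    Keys present in the override replace the profile's values;
    all other fields are left untouched.
    """
    override = json.loads(override_json)
    return {**profile, **override}

profile = {"max_input_tokens": 200_000, "supports_tools": True}
merged = apply_profile_override(profile, '{"max_input_tokens": 4096}')
# merged["max_input_tokens"] is now 4096; supports_tools is unchanged
```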
6 changes: 3 additions & 3 deletions src/oss/deepagents/cli/overview.mdx
@@ -334,7 +334,7 @@ For full details on switching models, setting a default, and adding custom models
Pass extra model constructor parameters when switching mid-session using `--model-params`:

```txt
> /model --model-params '{"temperature": 0.7}' anthropic:claude-sonnet-4-5
> /model --model-params '{"temperature": 0.7}' anthropic:claude-opus-4-7
> /model --model-params '{"temperature": 0.7}' # opens selector, applies params to chosen model
```

@@ -842,8 +842,8 @@ When configured, the CLI displays a status line with a link to the LangSmith project
deepagents --agent mybot

# Use a specific model (provider:model format or auto-detect)
deepagents --model anthropic:claude-sonnet-4-5
deepagents --model gpt-4o
deepagents --model anthropic:claude-opus-4-7
deepagents --model gpt-5.4

# Auto-approve tool usage (skip human-in-the-loop prompts)
deepagents -y
4 changes: 2 additions & 2 deletions src/oss/deepagents/customization.mdx
@@ -86,10 +86,10 @@ For the full parameter list, see the [`createDeepAgent`](https://reference.langc

## Model

Pass a `model` string in `provider:model` format, or an initialized model instance. Defaults to `anthropic:claude-sonnet-4-6`. See [supported models](/oss/deepagents/models#supported-models) for all providers and [suggested models](/oss/deepagents/models#suggested-models) for tested recommendations.
Pass a `model` string in `provider:model` format, or an initialized model instance. See [supported models](/oss/deepagents/models#supported-models) for all providers and [suggested models](/oss/deepagents/models#suggested-models) for tested recommendations.

<Tip>
Use the `provider:model` format (for example `openai:gpt-5`) to quickly switch between models.
Use the `provider:model` format (for example `openai:gpt-5.4`) to quickly switch between models.
</Tip>

:::python
8 changes: 4 additions & 4 deletions src/oss/deepagents/deploy.mdx
@@ -145,7 +145,7 @@ Core agent identity. For more on model selection and provider configuration, see
Name for the deployed agent. Used as the assistant identifier in LangSmith.
</ResponseField>

<ResponseField name="model" type="string" default="anthropic:claude-sonnet-4-6">
<ResponseField name="model" type="string">
Model identifier in `provider:model` format. See [supported models](/oss/deepagents/models#supported-models).
</ResponseField>

@@ -373,7 +373,7 @@ Each subagent subdirectory **may** contain:
[agent]
name = "researcher"
description = "Researches market trends, competitors, and target audiences"
model = "anthropic:claude-haiku-4-5-20251001"
model = "google_genai:gemini-3.1-pro-preview"
```

### Inheritance
@@ -404,14 +404,14 @@ A go-to-market agent that delegates research to a specialized subagent:
```toml deepagents.toml
[agent]
name = "gtm-strategist"
model = "anthropic:claude-sonnet-4-6"
model = "google_genai:gemini-3.1-pro-preview"
```

```toml subagents/researcher/deepagents.toml
[agent]
name = "researcher"
description = "Researches market trends, competitors, and target audiences to inform GTM strategy"
model = "anthropic:claude-haiku-4-5-20251001"
model = "google_genai:gemini-3.1-pro-preview"
```

```markdown subagents/researcher/AGENTS.md
4 changes: 2 additions & 2 deletions src/oss/deepagents/quickstart.mdx
@@ -14,7 +14,7 @@ This guide walks you through creating your first deep agent with planning, file

## Prerequisites

Before you begin, make sure you have an API key from a model provider (e.g., Anthropic, OpenAI).
Before you begin, make sure you have an API key from a model provider (e.g., Gemini, Anthropic, OpenAI).

<Note>
Deep Agents require a model that supports [tool calling](/oss/langchain/models#tool-calling). See [customization](/oss/deepagents/customization#model) for how to configure your model.
@@ -208,7 +208,7 @@ Use this to run an internet search for a given query. You can specify the max nu
"""
```

Pass a `model` string in `provider:model` format, or an initialized model instance. Defaults to `anthropic:claude-sonnet-4-6`. See [supported models](/oss/deepagents/models#supported-models) for all providers and [suggested models](/oss/deepagents/models#suggested-models) for tested recommendations.
Pass a `model` string in `provider:model` format, or an initialized model instance. See [supported models](/oss/deepagents/models#supported-models) for all providers and [suggested models](/oss/deepagents/models#suggested-models) for tested recommendations.

<Tabs>
<Tab title="Google">
4 changes: 2 additions & 2 deletions src/oss/langchain/agents.mdx
@@ -895,7 +895,7 @@ from langchain.agents import create_agent
from langchain.messages import SystemMessage, HumanMessage

literary_agent = create_agent(
model="anthropic:claude-sonnet-4-5",
model="google_genai:gemini-3.1-pro-preview",
system_prompt=SystemMessage(
content=[
{
@@ -929,7 +929,7 @@ import { createAgent } from "langchain";
import { SystemMessage, HumanMessage } from "@langchain/core/messages";

const literaryAgent = createAgent({
model: "anthropic:claude-sonnet-4-5",
model: "google_genai:gemini-3.1-pro-preview",
systemPrompt: new SystemMessage({
content: [
{
2 changes: 1 addition & 1 deletion src/oss/langchain/frontend/integrations/copilotkit.mdx
@@ -290,7 +290,7 @@ const structuredOutputMiddleware = createMiddleware({
});

export const agent = createAgent({
model: process.env.COPILOTKIT_MODEL ?? "anthropic:claude-haiku-4-5",
model: process.env.COPILOTKIT_MODEL ?? "google_genai:gemini-3.1-pro-preview",
contextSchema,
middleware: [structuredOutputMiddleware],
tools: [searchWebTool, deepSearchTool],
2 changes: 1 addition & 1 deletion src/oss/langchain/mcp.mdx
@@ -527,7 +527,7 @@ async with client.session("server_name") as session: # [!code highlight]
# Pass the session to load tools, resources, or prompts
tools = await load_mcp_tools(session) # [!code highlight]
agent = create_agent(
"anthropic:claude-3-7-sonnet-latest",
"google_genai:gemini-3.1-pro-preview",
tools
)
```
4 changes: 2 additions & 2 deletions src/oss/langchain/middleware/built-in.mdx
@@ -1833,7 +1833,7 @@ const agent = createAgent({
</ParamField>

<ParamField body="model" type="string | BaseChatModel">
Model to use for generating emulated tool responses. Can be a model identifier string (e.g., `'anthropic:claude-sonnet-4-6'`) or a `BaseChatModel` instance. Defaults to the agent's model if not specified. See @[`init_chat_model`][init_chat_model(model)] for more information.
Model to use for generating emulated tool responses. Can be a model identifier string (e.g., `'google_genai:gemini-3.1-pro-preview'`) or a `BaseChatModel` instance. Defaults to the agent's model if not specified. See @[`init_chat_model`][init_chat_model(model)] for more information.
</ParamField>
:::

@@ -1843,7 +1843,7 @@
</ParamField>

<ParamField body="model" type="string | BaseChatModel">
Model to use for generating emulated tool responses. Can be a model identifier string (e.g., `'anthropic:claude-sonnet-4-6'`) or a `BaseChatModel` instance. Defaults to the agent's model if not specified.
Model to use for generating emulated tool responses. Can be a model identifier string (e.g., `'google_genai:gemini-3.1-pro-preview'`) or a `BaseChatModel` instance. Defaults to the agent's model if not specified.
</ParamField>
:::

2 changes: 1 addition & 1 deletion src/oss/langchain/middleware/custom.mdx
@@ -1414,7 +1414,7 @@ const myOtherMiddleware = createMiddleware({
});

const agent = createAgent({
model: "anthropic:claude-3-5-sonnet",
model: "google_genai:gemini-3.1-pro-preview",
systemPrompt: "You are a helpful assistant.",
middleware: [myMiddleware, myOtherMiddleware],
});
@@ -1035,7 +1035,7 @@ from langchain.chat_models import init_chat_model
from langchain.messages import HumanMessage, ToolMessage
from langchain.tools import tool, ToolRuntime

model = init_chat_model("anthropic:claude-3-5-sonnet-latest")
model = init_chat_model("google_genai:gemini-3.1-pro-preview")


# Define the possible workflow steps
8 changes: 4 additions & 4 deletions src/oss/langchain/multi-agent/handoffs.mdx
@@ -478,13 +478,13 @@ def transfer_to_support(

# 3. Create agents with handoff tools
sales_agent = create_agent(
model="anthropic:claude-sonnet-4-20250514",
model="google_genai:gemini-3.1-pro-preview",
tools=[transfer_to_support],
system_prompt="You are a sales agent. Help with sales inquiries. If asked about technical issues or support, transfer to the support agent.",
)

support_agent = create_agent(
model="anthropic:claude-sonnet-4-20250514",
model="google_genai:gemini-3.1-pro-preview",
tools=[transfer_to_sales],
system_prompt="You are a support agent. Help with technical issues. If asked about pricing or purchasing, transfer to the sales agent.",
)
@@ -636,14 +636,14 @@ const transferToSupport = tool(

// 3. Create agents with handoff tools
const salesAgent = createAgent({
model: "anthropic:claude-sonnet-4-20250514",
model: "google_genai:gemini-3.1-pro-preview",
tools: [transferToSupport],
systemPrompt:
"You are a sales agent. Help with sales inquiries. If asked about technical issues or support, transfer to the support agent.",
});

const supportAgent = createAgent({
model: "anthropic:claude-sonnet-4-20250514",
model: "google_genai:gemini-3.1-pro-preview",
tools: [transferToSales],
systemPrompt:
"You are a support agent. Help with technical issues. If asked about pricing or purchasing, transfer to the sales agent.",
8 changes: 4 additions & 4 deletions src/oss/langchain/multi-agent/subagents.mdx
@@ -51,7 +51,7 @@ from langchain.tools import tool
from langchain.agents import create_agent

# Create a subagent
subagent = create_agent(model="anthropic:claude-sonnet-4-20250514", tools=[...])
subagent = create_agent(model="google_genai:gemini-3.1-pro-preview", tools=[...])

# Wrap it as a tool
@tool("research", description="Research a topic and return findings")
@@ -60,7 +60,7 @@ def call_research_agent(query: str):
return result["messages"][-1].content

# Main agent with subagent as a tool
main_agent = create_agent(model="anthropic:claude-sonnet-4-20250514", tools=[call_research_agent])
main_agent = create_agent(model="google_genai:gemini-3.1-pro-preview", tools=[call_research_agent])
```
:::
:::js
@@ -69,7 +69,7 @@ import { createAgent, tool } from "langchain";
import { z } from "zod";

// Create a subagent
const subagent = createAgent({ model: "anthropic:claude-sonnet-4-20250514", tools: [...] });
const subagent = createAgent({ model: "google_genai:gemini-3.1-pro-preview", tools: [...] });

// Wrap it as a tool
const callResearchAgent = tool(
@@ -87,7 +87,7 @@ const callResearchAgent = tool(
);

// Main agent with subagent as a tool
const mainAgent = createAgent({ model: "anthropic:claude-sonnet-4-20250514", tools: [callResearchAgent] });
const mainAgent = createAgent({ model: "google_genai:gemini-3.1-pro-preview", tools: [callResearchAgent] });
```
:::
