Checked other resources
- I added a very descriptive title to this issue.
- I searched the LangChain.js documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain.js rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
import { ChatVertexAI } from '@langchain/google-vertexai'
const chat = new ChatVertexAI({
model: 'claude-3-opus@20240229',
temperature: 0.0,
maxOutputTokens: 1200,
})
const result = await chat.invoke('Tell me a story')
console.log(result)
Error Message and Stack Trace (if applicable)
Error: Unable to verify model params: {"lc":1,"type":"constructor","id":["langchain","chat_models","chat_integration","ChatVertexAI"],"kwargs":{"model":"claude-3-opus","temperature":0,"max_output_tokens":1200,"platform_type":"gcp"}}
at validateModelParams (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/utils/common.js:100:19)
at copyAndValidateModelParamsInto (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/utils/common.js:105:5)
at new ChatGoogleBase (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/chat_models.js:191:9)
at new ChatGoogle (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-gauth/dist/chat_models.js:12:9)
at new ChatVertexAI (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-vertexai/dist/chat_models.js:10:9)
at main (file:///Users/redacted/src/langchain-repro/test.js:5:22)
at file:///Users/redacted/src/langchain-repro/test.js:22:1
at ModuleJob.run (node:internal/modules/esm/module_job:218:25)
at async ModuleLoader.import (node:internal/modules/esm/loader:329:24)
at async loadESM (node:internal/process/esm_loader:28:7)
Description
I am trying to use ChatVertexAI with Anthropic Claude 3, but this class only supports Gemini models and throws the error above.
This appears to be a deliberate choice in the code:
langchainjs/libs/langchain-google-common/src/utils/common.ts
Lines 125 to 132 in b9d86b1:
switch (modelToFamily(model)) {
  case "gemini":
    return validateGeminiParams(testParams);
  default:
    throw new Error(
      `Unable to verify model params: ${JSON.stringify(params)}`
    );
}
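For reference, a minimal sketch of what a fix might look like, assuming modelToFamily were extended to map claude-* model names to an "anthropic" family and a hypothetical validateClaudeParams helper were added (neither exists in @langchain/google-common today):
switch (modelToFamily(model)) {
  case "gemini":
    return validateGeminiParams(testParams);
  // Hypothetical branch: the "anthropic" family and validateClaudeParams
  // are assumptions for illustration, not part of the current library.
  case "anthropic":
    return validateClaudeParams(testParams);
  default:
    throw new Error(
      `Unable to verify model params: ${JSON.stringify(params)}`
    );
}
This only covers the parameter validation shown above; the request routing would of course also need an Anthropic-aware connection.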
I've verified that calling Claude on Vertex AI works fine via Anthropic's Vertex SDK:
import { AnthropicVertex } from '@anthropic-ai/vertex-sdk'
const projectId = 'my-project-id'
const region = 'us-east5'
// Goes through the standard `google-auth-library` flow.
const client = new AnthropicVertex({
projectId,
region,
})
async function main() {
const result = await client.messages.create({
model: 'claude-3-opus@20240229',
max_tokens: 100,
messages: [
{
role: 'user',
content: 'Hey Claude!',
},
],
})
console.log(JSON.stringify(result, null, 2))
}
main()
Output:
{
"id": "msg_vrtx_01M1yGR5LiteHznRmyK2MaPG",
"type": "message",
"role": "assistant",
"model": "claude-3-opus-20240229",
"stop_sequence": null,
"usage": {
"input_tokens": 10,
"output_tokens": 12
},
"content": [
{
"type": "text",
"text": "Hello! How can I assist you today?"
}
],
"stop_reason": "end_turn"
}
System Info
"@anthropic-ai/vertex-sdk": "^0.3.5",
"@google-cloud/vertexai": "^1.1.0",
"@langchain/google-vertexai": "^0.0.11",
"langchain": "^0.1.35"
Platform: Mac
Node: 20.11.0