ChatBedrockConverse ignores temperature value with Claude 3.5 #450

Open
@DHall19357

Description

When using ChatBedrockConverse with Claude 3.5 Sonnet, I get different responses across runs even with a very low temperature. This does not happen when I make a direct boto3 call with the same inference parameters.

Example Code

# attempts, prompt, parser, and llm_parsing are defined elsewhere in my app
from langchain_aws import ChatBedrockConverse
from langchain_core.runnables.retry import RunnableRetry

llm = ChatBedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    temperature=0.01,
    max_tokens=512,
    top_p=0.5,
)
llm_with_retries = RunnableRetry(
    bound=llm, max_attempts=attempts, min_seconds=1, max_seconds=10, jitter=True
)
response = llm_with_retries.invoke(prompt)
parsed_output = llm_parsing(
    llm=llm, parser=parser, response_content=response.content
)
parsed_output
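For comparison, the direct boto3 path that behaves deterministically can be sketched as follows (hypothetical helper names; assumes AWS credentials and region are configured, and that `prompt` holds the same input text as above):

```python
# Sketch of an equivalent direct boto3 Converse call for comparison
# (hypothetical helpers, not part of the repro above).

def build_inference_config(temperature=0.01, max_tokens=512, top_p=0.5):
    """Map the snake_case kwargs above onto the camelCase keys
    the Bedrock Converse API expects in inferenceConfig."""
    return {"temperature": temperature, "maxTokens": max_tokens, "topP": top_p}

def converse_direct(prompt, model_id="anthropic.claude-3-sonnet-20240229-v1:0"):
    # boto3 imported here so build_inference_config stays importable without it
    import boto3

    client = boto3.client("bedrock-runtime")
    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig=build_inference_config(),
    )
    return response["output"]["message"]["content"][0]["text"]
```

With identical `inferenceConfig` values, repeated `converse_direct` calls return stable responses, which is what makes the ChatBedrockConverse variance look like the temperature value is being dropped somewhere in the request.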

System Info

System Information

OS: Linux
OS Version: #1 SMP Fri Feb 14 16:52:40 UTC 2025
Python Version: 3.10.16 (main, Jan 13 2025, 14:07:23) [GCC 11.4.0]

Package Information

langchain_core: 0.3.37
langchain: 0.3.19
langchain_community: 0.3.18
langsmith: 0.3.8
langchain_aws: 0.2.13
langchain_text_splitters: 0.3.6
