
[BUG] CrewAI only supports certain Bedrock models #4046

@kiran-darji-bg

Description

The following code produces the error "This model doesn't support the stopSequences field. Remove stopSequences and try again.":

from crewai import LLM, Agent

# drop_params / additional_drop_params are meant to strip the unsupported
# stopSequences field before the request reaches Bedrock, but the error still occurs.
llm = LLM(
    model="bedrock/openai.gpt-oss-safeguard-120b",
    drop_params=True,
    additional_drop_params=["stopSequences"],
)
print(llm.__dict__)

agent = Agent(
    llm=llm,
    role="You are the front desk concierge at a luxury hotel.",
    goal="Provide exceptional service to guests by anticipating their needs and exceeding their expectations.",
    backstory="You have years of experience in hospitality and a deep understanding of guest preferences.",
    verbose=False,
    memory=False,
)
print(agent.kickoff("hello"))

This error is raised by the Bedrock boto3 client. After looking at the source code, my understanding is that CrewAI treats most, if not all, Bedrock models the same, i.e. it calls them all with the same arguments. The example works with model="bedrock/amazon.nova-micro-v1:0" (without needing any drop-params workaround).
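For reference, the rejection can be reproduced outside CrewAI with a bare bedrock-runtime Converse call. This is only a minimal sketch: the region is an assumption, and it assumes the model id maps directly once the bedrock/ prefix is stripped and that the account has access to it. The same request succeeds as soon as stopSequences is omitted.

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # region is an assumption
messages = [{"role": "user", "content": [{"text": "hello"}]}]

# Any stopSequences value is rejected for this model with
# "This model doesn't support the stopSequences field. ...".
try:
    client.converse(
        modelId="openai.gpt-oss-safeguard-120b",
        messages=messages,
        inferenceConfig={"maxTokens": 256, "stopSequences": ["Observation:"]},
    )
except client.exceptions.ValidationException as exc:
    print("rejected:", exc)

# The same call works once stopSequences is dropped.
response = client.converse(
    modelId="openai.gpt-oss-safeguard-120b",
    messages=messages,
    inferenceConfig={"maxTokens": 256},
)
print(response["output"]["message"]["content"][0]["text"])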

Steps to Reproduce

With the following dependencies installed:

"crewai ~= 1.6.1",
"crewai[bedrock] ~= 1.6.1",

run the snippet from the Description above.

This produces the error above. Switching the LLM to model="bedrock/amazon.nova-micro-v1:0" makes the same code respond successfully, as shown below.
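For comparison, the working variant only changes the model id; no drop-params workaround is needed:

llm = LLM(model="bedrock/amazon.nova-micro-v1:0")  # swap this into the snippet above; responds without errors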

Expected behavior

You should see LiteAgentOutput(raw='Hello! Welcome to our luxury hotel. My name ... printed.


Operating System

Other (specify in additional context)

Python Version

3.12

crewAI Version

1.6.1

crewAI Tools Version

NA

Virtual Environment

Venv


Possible Solution

I'm not sure exactly, but I can see that drop_params is not used in the BedrockCompletion class.

I had a play around and commented out

if self.stop_sequences:
    config["stopSequences"] = self.stop_sequences

in the BedrockCompletion class, but that got the agent caught in a loop in the CrewAgentExecutor. The "Observation" stop sequence is added by default for Bedrock, so maybe not relying on it would help? A sketch of one possible direction follows.
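As a rough illustration only (the attribute names self.drop_params, self.additional_drop_params and the surrounding fields are assumptions; I haven't checked what BedrockCompletion actually exposes), the inference config could honour the LLM's drop-params settings before the Converse call instead of hard-coding stopSequences:

# Hypothetical sketch, not the actual CrewAI code; attribute names are assumptions.
def _build_inference_config(self) -> dict:
    config: dict = {}
    if self.max_tokens:
        config["maxTokens"] = self.max_tokens
    if self.temperature is not None:
        config["temperature"] = self.temperature
    if self.stop_sequences:
        config["stopSequences"] = self.stop_sequences

    # Strip any fields the user asked to drop, mirroring how litellm's
    # drop_params / additional_drop_params behave for other providers.
    if getattr(self, "drop_params", False):
        for field in getattr(self, "additional_drop_params", None) or []:
            config.pop(field, None)
    return config

Even then, the executor would presumably need some signal other than the "Observation" stop sequence to end a ReAct turn for models that reject stopSequences, otherwise it loops as described above.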

Additional context

Using macOS Tahoe; ran in a notebook in VS Code.
