Description
I am still getting this error on the latest version:
crewai --version
crewai, version 0.120.1
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
Error during LLM call: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
I have tried both LLM(model="o3", stop=None) and LLM(model="o3"), with the same result.
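A minimal sketch of why passing stop=None may not be enough (this is illustrative only, not CrewAI's actual code): if the wrapper builds its request params with the stop key always present, "stop": None still reaches the provider call unless the key is removed entirely.

```python
# Hypothetical param-building logic; function names are illustrative,
# not taken from the crewai source.
def build_params(model: str, stop=None) -> dict:
    # Buggy pattern: the "stop" key is always included, even when None.
    return {"model": model, "stop": stop}

def build_params_fixed(model: str, stop=None) -> dict:
    # Fixed pattern: only include "stop" when the caller actually sets it.
    params = {"model": model}
    if stop is not None:
        params["stop"] = stop
    return params

print(build_params("o3"))        # {'model': 'o3', 'stop': None}
print(build_params_fixed("o3"))  # {'model': 'o3'}
```

If CrewAI's LLM wrapper follows the first pattern, stop=None would not suppress the parameter, which would match the behavior reported here.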
Steps to Reproduce
Configure OpenAI's o3 model for the agent: LLM(model="o3")
Expected behavior
The crew would run with the o3 reasoning model instead of failing with a 400 error.
Screenshots/Code snippets
LLM(model="o3")
Operating System
macOS Sonoma
Python Version
3.12
crewAI Version
0.120.1
crewAI Tools Version
0.120.1
Virtual Environment
Venv
Evidence
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
Error during LLM call: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
Traceback (most recent call last):
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 711, in completion
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 638, in completion
self.make_sync_openai_chat_completion_request(
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 457, in make_sync_openai_chat_completion_request
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 439, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/sentry_sdk/integrations/openai.py", line 277, in _sentry_patched_create_sync
return _execute_sync(f, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/sentry_sdk/integrations/openai.py", line 263, in _execute_sync
raise e from None
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/sentry_sdk/integrations/openai.py", line 260, in _execute_sync
result = f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/langtrace_python_sdk/instrumentation/openai/patch.py", line 381, in traced_method
result = wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1296, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 973, in request
return self._request(
^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1077, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/main.py", line 1692, in completion
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/main.py", line 1665, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 721, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/crewai/dev/agentic/my_content/.venv/bin/kickoff", line 10, in
sys.exit(kickoff())
^^^^^^^^^
File "/crewai/dev/agentic/my_content/src/my_content/main.py", line 101, in kickoff
action_flow.kickoff()
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/flow/flow.py", line 756, in kickoff
return asyncio.run(self.kickoff_async())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/runners.py", line 195, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/flow/flow.py", line 770, in kickoff_async
await asyncio.gather(*tasks)
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/flow/flow.py", line 802, in _execute_start_method
result = await self._execute_method(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/flow/flow.py", line 825, in _execute_method
else method(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/src/my_content/main.py", line 95, in my_content
self.state.my_content = MyContent().crew().kickoff(inputs=self.state.inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/langtrace_python_sdk/instrumentation/crewai/patch.py", line 91, in traced_method
result = wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/crew.py", line 576, in kickoff
result = self._run_sequential_process()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/crew.py", line 683, in _run_sequential_process
return self._execute_tasks(self.tasks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/crew.py", line 781, in _execute_tasks
task_output = task.execute_sync(
^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/langtrace_python_sdk/instrumentation/crewai/patch.py", line 91, in traced_method
result = wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/task.py", line 302, in execute_sync
return self._execute_core(agent, context, tools)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/task.py", line 366, in _execute_core
result = agent.execute_task(
^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/langtrace_python_sdk/instrumentation/crewai/patch.py", line 91, in traced_method
result = wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agent.py", line 254, in execute_task
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agent.py", line 243, in execute_task
result = self.agent_executor.invoke(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 112, in invoke
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 102, in invoke
formatted_answer = self._invoke_loop()
^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 160, in _invoke_loop
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 140, in _invoke_loop
answer = self._get_llm_response()
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 210, in _get_llm_response
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/agents/crew_agent_executor.py", line 201, in _get_llm_response
answer = self.llm.call(
^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/crewai/llm.py", line 291, in call
response = litellm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/langtrace_python_sdk/instrumentation/litellm/patch.py", line 291, in traced_method
result = wrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1154, in wrapper
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1032, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/main.py", line 3068, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2201, in exception_type
raise e
File "/crewai/dev/agentic/my_content/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 326, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
Sentry is attempting to send 1 pending events
Waiting up to 2 seconds
Press Ctrl-C to quit
An error occurred while running the flow: Command '['uv', 'run', 'kickoff']' returned non-zero exit status 1.
Possible Solution
Issue #2738 says support for LLM(model="o3", stop=None) was implemented, but it does not appear to work.
Additional context
It would be great to get this working, as the same error also affects LLM(model="o4-mini", stop=None).