Describe the bug
The sleeptime agent never runs for some of my agents.
Please describe your setup
How are you running Letta?
Official Docker image, version 0.15.1.
Describe your setup:
Railway, using official Docker image. Postgres as a separate DB instance.
Additional context
Sleeptime configured with default value (enable_sleeptime = true, created via API).
Sleeptime is configured with the default value (enable_sleeptime = true, created via the API). The main agent is a letta_v1_agent with tens of messages. It has a 120K context window holding quite a lot of content, though agents with even more content work as expected. The sleeptime agent has a 30K context window that is not full. The sleeptime agent's chat in the ADE is empty: no tool calls or other activity.
What model are you using?
anthropic/claude-sonnet-4-5-20250929
Full log when searching for the agent ID (all entries occur within the same second):
Letta.agent-4e5c3532-0171-412b-b99d-ba48a75d6de7 - WARNING - Error during step processing: Unhandled LLM error: Streaming is required for operations that may take longer than 10 minutes. See https://github.com/anthropics/anthropic-sdk-python#long-requests for more details
Letta.agent-4e5c3532-0171-412b-b99d-ba48a75d6de7 - INFO - Running final update. Step Progression: StepProgression.START
Letta.letta.services.run_manager - WARNING - Run run-5c05d5c3-cb01-4bae-add2-56167ff49d9f completed without a stop reason
Letta.letta.utils - ERROR - participant_agent_step_agent-4e5c3532-0171-412b-b99d-ba48a75d6de7 failed with LLMError: Unhandled LLM error: Streaming is required for operations that may take longer than 10 minutes. See https://github.com/anthropics/anthropic-sdk-python#long-requests for more details
Traceback (most recent call last):
File "/app/letta/adapters/simple_llm_request_adapter.py", line 42, in invoke_llm
self.response_data = await self.llm_client.request_async(request_data, self.llm_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/letta/otel/tracing.py", line 374, in async_wrapper
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/letta/llm_api/anthropic_client.py", line 127, in request_async
response = await client.beta.messages.create(**request_data, betas=betas)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.11/site-packages/anthropic/resources/beta/messages/messages.py", line 2649, in create
timeout = self._client._calculate_nonstreaming_timeout(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.11/site-packages/anthropic/_base_client.py", line 709, in _calculate_nonstreaming_timeout
raise ValueError(
ValueError: Streaming is required for operations that may take longer than 10 minutes. See https://github.com/anthropics/anthropic-sdk-python#long-requests for more details
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/letta/utils.py", line 1139, in wrapper
await coro
File "/app/letta/otel/tracing.py", line 374, in async_wrapper
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/letta/groups/sleeptime_multi_agent_v4.py", line 208, in _participant_agent_step
result = await sleeptime_agent.step(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/letta/otel/tracing.py", line 374, in async_wrapper
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/letta/agents/letta_agent_v3.py", line 139, in step
async for chunk in response:
File "/app/letta/agents/letta_agent_v3.py", line 701, in _step
raise e
File "/app/letta/agents/letta_agent_v3.py", line 598, in _step
raise e
File "/app/letta/agents/letta_agent_v3.py", line 586, in _step
async for chunk in invocation:
File "/app/letta/adapters/simple_llm_request_adapter.py", line 44, in invoke_llm
raise self.llm_client.handle_llm_error(e)
letta.errors.LLMError: Unhandled LLM error: Streaming is required for operations that may take longer than 10 minutes. See https://github.com/anthropics/anthropic-sdk-python#long-requests for more details
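For context, the `ValueError` originates in the SDK's `_calculate_nonstreaming_timeout` check: the Anthropic Python SDK estimates a non-streaming request's worst-case duration from `max_tokens` and refuses to send it when that estimate exceeds 10 minutes, forcing callers onto the streaming API. A minimal sketch of that heuristic, with constants assumed from the SDK source (they may vary between versions, and the real check also honors explicit `timeout` overrides):

```python
# Hedged sketch of the anthropic-sdk-python non-streaming timeout heuristic.
# Assumption: the SDK budgets roughly 60 minutes of generation time per 128K
# output tokens and rejects non-streaming requests whose estimate exceeds the
# 10-minute ceiling. Exact constants may differ by SDK version.

def requires_streaming(max_tokens: int) -> bool:
    """Return True when the SDK would raise 'Streaming is required ...'."""
    maximum_time_s = 60 * 60                              # assumed: 1 hour per 128K tokens
    estimated_s = maximum_time_s * max_tokens / 128_000   # worst-case duration estimate
    return estimated_s > 10 * 60                          # 10-minute non-streaming ceiling

# Under these assumed constants, a request with max_tokens around the sleeptime
# agent's 30K budget (~14 min estimate) would trip the check, while ~20K would not:
print(requires_streaming(30_000))  # True
print(requires_streaming(20_000))  # False
```

If this is what is happening, the fix on Letta's side would be to issue the sleeptime agent's LLM call via the SDK's streaming path (or pass an explicit timeout) rather than `messages.create` without streaming, which matches the `simple_llm_request_adapter.py` frames in the traceback.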