Agent output streams #1116
Replies: 11 comments
-
Interesting. I'm curious to hear more about the use cases so we can build something that works great for them. Does anything come to mind?
-
One use case would be streaming the agent result to a web frontend.
-
I agree. I'd like to send answers via WebSockets to a web client.
-
I also want to be able to stream live output to a frontend. This seems related to #146 as well.
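For what it's worth, the frontend side can be decoupled from the agent run with a queue. Here is a minimal sketch of that pattern, assuming an asyncio app; `fake_agent` and `collect` are hypothetical stand-ins, not CrewAI APIs:

```python
import asyncio

async def fake_agent(queue: asyncio.Queue) -> None:
    """Stand-in for the agent run: push output chunks, then a sentinel."""
    for chunk in ["Thinking...", "Answer part 1", "Answer part 2"]:
        await queue.put(chunk)
    await queue.put(None)  # sentinel: run is finished

async def collect(queue: asyncio.Queue) -> list:
    """Consume chunks until the sentinel; a WebSocket handler would sit here."""
    chunks = []
    while (chunk := await queue.get()) is not None:
        chunks.append(chunk)  # in a real app: await websocket.send(chunk)
    return chunks

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    producer = asyncio.create_task(fake_agent(queue))
    chunks = await collect(queue)
    await producer
    return chunks

print(asyncio.run(main()))
```

In a real app the consumer would forward each chunk over the WebSocket connection instead of collecting it into a list.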
-
Currently I have a workaround: run crew.kickoff() in a subprocess, capture the terminal output, and stream the stdout. It's ugly and picks up unsupported characters, but at least it streams.
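That workaround can be sketched roughly like this, assuming the crew runs as a separate Python process; the `python -c` command below is just a stand-in for launching the actual crew script:

```python
import subprocess
import sys

def stream_subprocess(cmd):
    """Launch cmd and yield its stdout line by line as it appears."""
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
        errors="replace",  # don't crash on unsupported characters
        bufsize=1,         # line-buffered
    )
    for line in proc.stdout:
        yield line
    proc.wait()

# Stand-in command; the workaround would launch the crew script here.
for line in stream_subprocess([sys.executable, "-c", "print('streamed line')"]):
    print(line, end="")
```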
-
Please share the code.

```python
from io import StringIO  # Python 3
import sys

# Create the in-memory "file"
temp_out = StringIO()
# Replace the default stdout (terminal) with our stream
sys.stdout = temp_out
print("This is going into the memory stream")
# Restore the real stdout and read back what was captured
sys.stdout = sys.__stdout__
captured = temp_out.getvalue()
```

Sometimes I see the agents' questions to one another in the console; I'd really like to capture those too.
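A tidier variant of the same stdout-capture idea uses contextlib.redirect_stdout, which restores sys.stdout automatically even if the wrapped call raises:

```python
from contextlib import redirect_stdout
from io import StringIO

buf = StringIO()
with redirect_stdout(buf):
    # Anything printed here goes into buf, not the terminal
    print("captured line")
# stdout is restored automatically when the with-block exits
print(buf.getvalue(), end="")
```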
-
Capturing stdout would be a dream: chain-of-thought reasoning for audit logs and such.
-
Any news on this?
-
If CrewAI could add streaming outputs, it would be a breakthrough.
-
I genuinely don't want to use this:

```python
import asyncio
from typing import AsyncGenerator

async def stream(text: str, chunk_size: int = 4, sleep: float = 0.03) -> AsyncGenerator[str, None]:
    """Generator function to yield chunks of the final answer."""
    for i in range(0, len(text), chunk_size):
        # time.sleep would block the event loop; use asyncio.sleep instead
        await asyncio.sleep(sleep)
        yield text[i : i + chunk_size]

async def main() -> None:
    text = "This is a test message. This is a piece of text that will be streamed. This is the final message."
    async for chunk in stream(text):
        print(chunk, end="")

asyncio.run(main())
```
-
+1 on this feature.
-
I just saw that AutoGen is planning to add an output-streams feature beyond console output:
microsoft/autogen#1290 (comment)
It would be great if CrewAI provided a similar feature.