-

```python
import asyncio

class MyCustomListener(BaseEventListener):
    def __init__(self):
        super().__init__()
        self.queue = asyncio.Queue()

    def setup_listeners(self, crewai_event_bus):
        @crewai_event_bus.on(LLMStreamChunkEvent)
        async def on_llm_stream_chunk(event: LLMStreamChunkEvent):
            # Push each streamed chunk onto the queue for the consumer below.
            await self.queue.put(event.chunk)

    async def get_chunks(self):
        # Drain the queue into Server-Sent-Events lines until a
        # "[DONE]" sentinel arrives.
        while True:
            try:
                chunk = await self.queue.get()
                if chunk == "[DONE]":
                    break
                yield f"data: {chunk}\n\n"
            except asyncio.CancelledError:
                break

stream_handler = MyCustomListener()
crew.add_event_listener(stream_handler)
```

I found LLMStreamChunkEvent on the official website. How can I make this event work?
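The queue-bridging pattern in the listener above can be exercised without CrewAI at all. Below is a minimal, self-contained sketch: the hypothetical `ChunkStreamer` class and its `on_chunk` method stand in for the event handler, and the `[DONE]` sentinel mirrors the one in the question — everything CrewAI-specific is stubbed out.

```python
import asyncio

class ChunkStreamer:
    """Stand-in for the event listener: chunks go in, SSE lines come out."""

    def __init__(self):
        self.queue = asyncio.Queue()

    async def on_chunk(self, chunk):
        # In the real listener this body would live inside the
        # LLMStreamChunkEvent handler.
        await self.queue.put(chunk)

    async def get_chunks(self):
        # Drain the queue into Server-Sent-Events lines until the
        # "[DONE]" sentinel arrives.
        while True:
            chunk = await self.queue.get()
            if chunk == "[DONE]":
                break
            yield f"data: {chunk}\n\n"

async def demo():
    streamer = ChunkStreamer()
    for piece in ("Hello", " world", "[DONE]"):
        await streamer.on_chunk(piece)
    return [line async for line in streamer.get_chunks()]

print(asyncio.run(demo()))
```

Once this pattern works in isolation, the remaining question is only how the real event bus delivers chunks into the queue.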
-
```python
from fastapi import FastAPI
from pydantic import BaseModel

def ask_question_stream(question: str):
    # Note: crew.kickoff() is a blocking call; it returns only after
    # the crew has finished, so nothing is actually streamed here.
    async_result = crew.kickoff(inputs={"question": question})
    return async_result

app = FastAPI()

class QuestionRequest(BaseModel):
    question: str

@app.post("/ask")
async def answer_question(request: QuestionRequest):
    return ask_question_stream(request.question)
This is my current code, how should I proceed?
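One issue with the code above is that the kickoff call is synchronous, so the event loop would block instead of streaming. A common workaround is to run the blocking call in a worker thread and hand chunks back to the loop thread-safely, then serve the async generator via FastAPI's `StreamingResponse`. Here is a stdlib-only sketch of that bridge — `blocking_kickoff` and `emit` are hypothetical stand-ins for `crew.kickoff` and the event handler, not CrewAI APIs:

```python
import asyncio

def blocking_kickoff(emit):
    # Hypothetical stand-in for crew.kickoff(): a synchronous call that
    # produces chunks via the thread-safe `emit` callback.
    for piece in ("partial ", "answer"):
        emit(piece)
    emit("[DONE]")
    return "final result"

async def stream_answer():
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()

    # Listener callbacks fire inside the worker thread, so hop back to
    # the event loop with call_soon_threadsafe.
    def emit(chunk):
        loop.call_soon_threadsafe(queue.put_nowait, chunk)

    # Run the synchronous kickoff in a worker thread so the event loop
    # stays free to drain the queue concurrently.
    task = asyncio.create_task(asyncio.to_thread(blocking_kickoff, emit))

    lines = []
    while True:
        chunk = await queue.get()
        if chunk == "[DONE]":
            break
        lines.append(f"data: {chunk}\n\n")
    await task  # propagate any exception raised inside the kickoff
    return lines

print(asyncio.run(stream_answer()))
```

In the endpoint itself, instead of returning the kickoff result directly, you would return the generator wrapped in `StreamingResponse(..., media_type="text/event-stream")` and start the kickoff as a background task, so chunks reach the client as they arrive.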