Commit a6a3958

fix: Cut responses are properly killed and don't continue
See #398, #392.
1 parent 6d46126 commit a6a3958

File tree

1 file changed (+3, -3 lines)

app/helpers/call_llm.py (+3, -3)
@@ -201,8 +201,8 @@ async def _response_callback(_retry: bool = False) -> None:
         )
     )
 
-    # Process the response and wait for latency metrics
-    await _commit_answer(wait=False)
+    # Process the response and wait for it to be able to kill the task if needed
+    await _commit_answer(wait=True)
 
     # First call
     if len(call.messages) <= 1:
@@ -215,7 +215,7 @@ async def _response_callback(_retry: bool = False) -> None:
         )
     # User is back
     else:
-        # Welcome with the LLM, do not use the end call tool for the first message, LLM hallucinates it and this is extremely frustrating for the user
+        # Welcome with the LLM, do not use the end call tool for the first message, LLM hallucinates it and this is extremely frustrating for the user, don't wait for the response to start the VAD quickly
         await _commit_answer(
             tool_blacklist={"end_call"},
             wait=False,
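
A minimal, self-contained asyncio sketch of the behaviour this change targets follows. It is not the project's code: commit_answer and response_callback are hypothetical stand-ins for _commit_answer and _response_callback in app/helpers/call_llm.py. The point it illustrates is that an awaited answer (wait=True) runs inside the callback task, so cancelling that task also kills the in-flight answer, whereas a fire-and-forget answer (wait=False) keeps streaming after the cancellation, which is presumably the "cut response keeps going" behaviour referenced in #398 and #392. The second hunk deliberately keeps wait=False for the greeting so voice activity detection can start without waiting for it.

import asyncio


async def commit_answer(wait: bool) -> None:
    # Simplified stand-in for _commit_answer: "streams" an answer chunk by chunk.
    async def stream() -> None:
        try:
            for i in range(10):
                await asyncio.sleep(0.1)
                print(f"chunk {i} (wait={wait})")
        except asyncio.CancelledError:
            print(f"stream cancelled (wait={wait})")
            raise

    if wait:
        # The streaming stays inside the caller's task, so cancelling the
        # caller cancels the stream too: a cut response actually stops.
        await stream()
    else:
        # Fire-and-forget: the stream runs in its own task and keeps going
        # even after the caller has been cancelled.
        asyncio.ensure_future(stream())


async def response_callback(wait: bool) -> None:
    # Simplified stand-in for _response_callback, the task cancelled when the
    # user barges in and the current response must be cut.
    await commit_answer(wait=wait)
    await asyncio.sleep(1)  # keep the callback alive a little longer


async def main() -> None:
    for wait in (False, True):
        print(f"--- wait={wait} ---")
        task = asyncio.create_task(response_callback(wait=wait))
        await asyncio.sleep(0.25)  # the user interrupts: the response is cut
        task.cancel()
        try:
            await task
        except asyncio.CancelledError:
            pass
        await asyncio.sleep(0.7)  # watch whether chunks keep printing


asyncio.run(main())

Running it should show chunks continuing to print after the cancel in the wait=False pass and stopping immediately in the wait=True pass.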
