
ignore stale ai replies from older requests #116

Open

7se7en72025 wants to merge 1 commit into sugarlabs:main from 7se7en72025:codex/calm-chat-no-stale-replies

Conversation

@7se7en72025

issue

Chatbot responses are fetched asynchronously. If a user sends another prompt before the previous one returns, the older reply can still be spoken after the newer one, putting the conversation out of order.

fix

  • add monotonic chatbot request ids
  • add thread-safe helpers to mark/check latest request
  • gate face.say(...) so only the latest request is spoken
  • keep existing fallback order (LLM -> SLM -> brain)

impact

Prevents stale responses from talking over newer context, so chat flow stays correct and predictable under fast input or slow network.
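The approach described above could be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the helper names (`new_request_id`, `is_latest`, `handle_reply`) and the `say` callback standing in for `face.say(...)` are all hypothetical.

```python
import itertools
import threading

# Monotonic request ids and a thread-safe record of the latest one.
_request_counter = itertools.count(1)
_latest_request = 0
_lock = threading.Lock()

def new_request_id():
    """Allocate a monotonic id and mark it as the latest request."""
    global _latest_request
    with _lock:
        request_id = next(_request_counter)
        _latest_request = request_id
        return request_id

def is_latest(request_id):
    """Thread-safe check: is this id still the newest request?"""
    with _lock:
        return request_id == _latest_request

def handle_reply(request_id, reply, say):
    """Gate speech: speak only if no newer request has superseded this one."""
    if is_latest(request_id):
        say(reply)  # in the activity this would be face.say(reply)
```

With this gating, a slow reply to request 1 that arrives after request 2 was issued is silently dropped, because `is_latest(1)` is false by then. The fallback order (LLM -> SLM -> brain) is untouched; only the final "speak" step checks the id.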

@BhumiTalwar
Contributor

Hi @chimosky, I’ve been looking over this PR. While it tries to solve the issue of stale replies, I'm concerned that the added complexity of thread locks and request IDs might not be necessary for our current needs. Please recheck this and let me know whether we should move forward or if there's a simpler way to handle asynchronous responses, such as using standard async timeouts or clearing the speech queue. Thanks!

@chimosky
Member

chimosky commented Apr 28, 2026

> Hi @chimosky, I’ve been looking over this PR. While it tries to solve the issue of stale replies, I'm concerned that the added complexity of thread locks and request IDs might not be necessary for our current needs. Please recheck this and let me know whether we should move forward or if there's a simpler way to handle asynchronous responses, such as using standard async timeouts or clearing the speech queue. Thanks!

I agree with you re it being unnecessary for our current needs; we don't have enough usage that we need to worry about this problem, and locks might introduce unintended consequences.

One thing we can do is have the bot say something if a response is taking longer than expected.

That's a better tradeoff to make than introducing potential issues.
