Bug: tool call input-available state never rendered during streaming #221
Description
TL;DR: I have long-running tools and want to show the tool call inputs in the frontend to indicate progress. This doesn't seem possible.
Below is an AI-generated summary. Please let me know if I can help in any capacity. I had to do some unholy things to get the tool call to show up in the stream.
Summary
useUIMessages never renders tool call parts with state: "input-available". The part only appears once the tool finishes, jumping straight to state: "output-available" with both input and output populated. For slow tools (e.g. 7 seconds), the user sees nothing until execution completes.
What I see
The tool call part only ever appears fully resolved:
```json
{
  "input": { "message": "Completing your request", "seconds": 3 },
  "output": { "message": "Completing your request", "slept": 3 },
  "state": "output-available",
  "toolCallId": "be4ba54e8",
  "type": "tool-sleep"
}
```

No prior render shows state: "input-available" or state: "input-streaming". The part is absent from message.parts entirely until the tool finishes, then appears with both input and output populated.
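The intermediate states are exactly what a progress UI would key off. As an illustration (not code from this repo), here is a minimal render helper in plain TypeScript; the part shape is simplified from the AI SDK's tool UIPart:

```ts
interface ToolPart {
  type: string;
  toolCallId: string;
  state: "input-streaming" | "input-available" | "output-available";
  input?: unknown;
  output?: unknown;
}

function renderToolPart(part: ToolPart): string {
  switch (part.state) {
    case "input-streaming":
      return `Preparing ${part.type}...`;
    case "input-available":
      // This is the render that never happens during streaming today.
      return `Running ${part.type} with ${JSON.stringify(part.input)}`;
    case "output-available":
      return `Finished ${part.type}: ${JSON.stringify(part.output)}`;
  }
}
```

With the current behavior, only the "output-available" branch is ever reached.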
Where in the code
The delta pipeline saves UIMessageChunks and the client reconstructs via readUIMessageStream. The underlying AI SDK emits tool-input-available and tool-output-available as separate chunks (toUIMessageStream in AI SDK — tool-call → tool-input-available, tool-result → tool-output-available), so the DeltaStreamer should persist them as separate deltas with a gap between them.
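To make the expected behavior concrete, here is a small sketch (not the SDK's or the component's actual implementation; chunk shapes are simplified from the AI SDK's UIMessageChunk union) of how the two chunks should produce two distinct renderable states, with a gap between them:

```ts
type Chunk =
  | { type: "tool-input-available"; toolCallId: string; toolName: string; input: unknown }
  | { type: "tool-output-available"; toolCallId: string; output: unknown };

interface StreamedToolPart {
  type: string;
  toolCallId: string;
  state: "input-available" | "output-available";
  input?: unknown;
  output?: unknown;
}

const parts = new Map<string, StreamedToolPart>();

function applyChunk(chunk: Chunk): void {
  if (chunk.type === "tool-input-available") {
    // First chunk creates the part in its intermediate state.
    parts.set(chunk.toolCallId, {
      type: `tool-${chunk.toolName}`,
      toolCallId: chunk.toolCallId,
      state: "input-available",
      input: chunk.input,
    });
  } else {
    // Second chunk upgrades the same part once the tool finishes.
    const part = parts.get(chunk.toolCallId);
    if (part !== undefined) {
      part.state = "output-available";
      part.output = chunk.output;
    }
  }
}

// The input chunk arrives first; the part is already renderable here.
applyChunk({ type: "tool-input-available", toolCallId: "tc1", toolName: "sleep", input: { seconds: 7 } });
const stateBetweenChunks = parts.get("tc1")!.state;

// Seconds later the output chunk arrives and upgrades the same part.
applyChunk({ type: "tool-output-available", toolCallId: "tc1", output: { slept: 7 } });
```

The bug is that the state captured in `stateBetweenChunks` is never surfaced by useUIMessages.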
There's also a TODO in deltas.ts that acknowledges the problem:
```ts
// deriveUIMessagesFromDeltas — line ~109
const uiMessage = await updateFromUIMessageChunks(
  blankUIMessage(streamMessage, threadId),
  parts,
);
// TODO: this fails on partial tool calls
messages.push(uiMessage);
```

The test suite (deltas.test.ts) only covers complete tool call sequences — every test includes tool-output-available alongside tool-input-available. No test exercises the intermediate state where input is available but output hasn't arrived yet.
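A sketch of the missing test case: feed only the input chunk and assert the intermediate state. The `applyChunks` function below is a stand-in for updateFromUIMessageChunks (which I haven't imported here), so this only illustrates the shape of the assertion:

```ts
type PartialChunk = {
  type: "tool-input-available" | "tool-output-available";
  toolCallId: string;
  input?: unknown;
  output?: unknown;
};

// Stand-in reducer: folds chunks into per-toolCallId part objects.
function applyChunks(chunks: PartialChunk[]) {
  const byId: Record<string, { state: string; input?: unknown; output?: unknown }> = {};
  for (const c of chunks) {
    if (c.type === "tool-input-available") {
      byId[c.toolCallId] = { state: "input-available", input: c.input };
    } else if (byId[c.toolCallId] !== undefined) {
      byId[c.toolCallId].state = "output-available";
      byId[c.toolCallId].output = c.output;
    }
  }
  return byId;
}

// Only the input chunk — no tool-output-available yet:
const partial = applyChunks([
  { type: "tool-input-available", toolCallId: "tc1", input: { seconds: 7 } },
]);
// The part should exist in the intermediate state, with no output.
```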
I also suspect a race in useStreamingUIMessages.ts — the async effect can be aborted before setMessageState runs:
```ts
const abortController = new AbortController();
void (async () => {
  const newMessageState = Object.fromEntries(
    await Promise.all(
      streams.map(async ({ deltas, streamMessage }) => {
        // ... await updateFromUIMessageChunks(...)
      }),
    ),
  );
  if (abortController.signal.aborted) return; // intermediate state dropped
  setMessageState(newMessageState);
})();
return () => { abortController.abort(); };
```

If a new delta (the tool-output-available) triggers a re-render while updateFromUIMessageChunks is still processing the previous batch, the intermediate state is computed but never set.
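A synchronous model of why that guard would hide every intermediate state (assuming my reading of the effect is right; this is not the hook's code, just the abort logic reduced to data):

```ts
// Each effect run computes a message state; the abort guard drops the
// result whenever a newer delta re-triggered the effect first.
interface EffectRun {
  computed: string;           // state the run finished computing
  abortedBeforeCommit: boolean; // cleanup ran before setMessageState
}

function simulate(runs: EffectRun[]): string[] {
  const committed: string[] = [];
  for (const run of runs) {
    if (run.abortedBeforeCommit) continue; // `if (signal.aborted) return;`
    committed.push(run.computed);
  }
  return committed;
}

// Run 1 computes the input-available state but is aborted when the
// tool-output-available delta arrives; run 2 commits the final state.
const committed = simulate([
  { computed: "input-available", abortedBeforeCommit: true },
  { computed: "output-available", abortedBeforeCommit: false },
]);
// Only "output-available" survives; the intermediate render is skipped.
```

If updateFromUIMessageChunks is slow relative to delta arrival, every intermediate batch loses this race the same way.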
Reproduction
Backend — agent with a slow tool:
```ts
const sleepTool = tool({
  description: "Sleep for N seconds then return a message.",
  inputSchema: zodSchema(
    z.object({ seconds: z.number(), message: z.string() }),
  ),
  execute: async ({ seconds, message }) => {
    await new Promise((resolve) => setTimeout(resolve, seconds * 1000));
    return { slept: seconds, message };
  },
});

const agent = new Agent(components.agent, {
  name: "Test Agent",
  languageModel: gateway("your-model"),
  tools: { sleep: sleepTool },
  instructions: "Always use the sleep tool with seconds=7.",
  stopWhen: stepCountIs(3),
});

// internalAction: saveMessage, streamText with saveStreamDeltas: true, consumeStream
// authQuery: listUIMessages + syncStreamsFrontend — render parts raw:
const msgsResult = useUIMessages(api.dev.listDevMessages, { threadId, ... }, { stream: true });
const lastAssistant = messages?.filter(m => m.role === "assistant").at(-1);
// Render lastAssistant.parts as JSON — tool-sleep part only appears
// after 7 seconds, fully populated with both input and output.
```

Workaround
We maintain a parallel pipeline that reads raw stream deltas via a custom query calling components.agent.streams.list + components.agent.streams.listDeltas, parses them into tool call objects with granular state tracking, and switches between the two sources. ~300 lines across 3 files.
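The switching step can be sketched as a pure function (all names here are illustrative, not the actual workaround code): prefer the delta-derived parts while the stream is live, and fall back to the persisted UIMessage parts otherwise.

```ts
interface AnyToolPart {
  toolCallId: string;
  state: string;
}

// streamingParts: parsed from raw stream deltas (granular states).
// persistedParts: from useUIMessages (today, only fully resolved).
function selectToolParts(
  streamingParts: AnyToolPart[] | undefined,
  persistedParts: AnyToolPart[],
  streamActive: boolean,
): AnyToolPart[] {
  return streamActive && streamingParts !== undefined && streamingParts.length > 0
    ? streamingParts
    : persistedParts;
}
```

If useUIMessages surfaced the intermediate states itself, this entire parallel pipeline would be unnecessary.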
Versions
- @convex-dev/agent@0.3.2
- ai@5.0.123
- Format: UIMessageChunk (hardcoded in client/streamText.ts)