Initial Checks
- [x] I'm using the latest version of Pydantic AI
- [x] I've searched for my issue in the issue tracker before opening this issue
Description
Bug Description
`AGUIEventStream.handle_tool_call_delta` receives a `ToolCallPartDelta` after `handle_builtin_tool_call_end` has already been called for the same `tool_call_id`. This causes a `TOOL_CALL_ARGS` SSE event to be emitted after `TOOL_CALL_END` for the same tool call, which violates the AG-UI protocol and crashes the `@ag-ui/client` verifier.
Observed SSE Output (from curling the backend)
```
TOOL_CALL_START (id: mcp_22b...)
TOOL_CALL_ARGS  (id: mcp_22b..., delta: '{"action":"call_tool",...,"tool_args":')
TOOL_CALL_ARGS  (id: mcp_22b..., delta: '}')
TOOL_CALL_END   (id: mcp_22b...)
TOOL_CALL_ARGS  (id: mcp_22b..., delta: '}')   <- stray delta AFTER end
```
The final `TOOL_CALL_ARGS` event arrives after `TOOL_CALL_END`, which is invalid per the AG-UI protocol spec.
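To make the violation concrete, here is a minimal ordering check in plain Python. This is an illustrative sketch of the rule the client-side verifier enforces, not the actual `@ag-ui/client` code; the function name and event representation are mine.

```python
# Hypothetical validator for AG-UI tool-call event ordering:
# every TOOL_CALL_ARGS must fall between TOOL_CALL_START and
# TOOL_CALL_END for its tool_call_id.

def validate_tool_call_order(events: list[tuple[str, str]]) -> list[str]:
    """Return a list of ordering violations for (event_type, tool_call_id) pairs."""
    started: set[str] = set()
    ended: set[str] = set()
    violations: list[str] = []
    for kind, call_id in events:
        if kind == "TOOL_CALL_START":
            started.add(call_id)
        elif kind == "TOOL_CALL_ARGS":
            if call_id not in started:
                violations.append(f"ARGS before START for {call_id}")
            if call_id in ended:
                violations.append(f"ARGS after END for {call_id}")
        elif kind == "TOOL_CALL_END":
            ended.add(call_id)
    return violations

# The sequence observed above:
observed = [
    ("TOOL_CALL_START", "mcp_22b"),
    ("TOOL_CALL_ARGS", "mcp_22b"),
    ("TOOL_CALL_ARGS", "mcp_22b"),
    ("TOOL_CALL_END", "mcp_22b"),
    ("TOOL_CALL_ARGS", "mcp_22b"),  # stray delta after end
]
print(validate_tool_call_order(observed))  # → ['ARGS after END for mcp_22b']
```

Running this against the observed stream reports exactly one violation: the stray post-end delta.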
Steps to Reproduce
- Use `MCPServerTool` with the OpenAI Responses API model in streaming mode via the AG-UI endpoint.
- Trigger a tool call that emits multiple argument deltas.
- Inspect the raw SSE stream (e.g. with `curl`).
- Observe a `TOOL_CALL_ARGS` event emitted after `TOOL_CALL_END` for the same `tool_call_id`.
Expected Behavior
No `TOOL_CALL_ARGS` events should be emitted for a `tool_call_id` after `TOOL_CALL_END` has been sent for that ID. The stream should be ordered: `TOOL_CALL_START` → one or more `TOOL_CALL_ARGS` → `TOOL_CALL_END`.
Actual Behavior
A stray `TOOL_CALL_ARGS` delta is emitted after `TOOL_CALL_END`, violating the AG-UI protocol and causing the `@ag-ui/client` verifier to crash/error.
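One possible mitigation, sketched here as plain Python, is for the event stream to remember which tool call IDs have already ended and drop any late argument deltas. This is my illustration of the idea only; the class and method names are hypothetical and do not reflect Pydantic AI's actual `AGUIEventStream` implementation.

```python
# Hypothetical guard: suppress TOOL_CALL_ARGS for ids that have
# already seen TOOL_CALL_END. All names here are illustrative,
# not Pydantic AI's API.

class ToolCallOrderGuard:
    def __init__(self) -> None:
        self._ended: set[str] = set()

    def mark_ended(self, tool_call_id: str) -> None:
        """Record that TOOL_CALL_END was emitted for this id."""
        self._ended.add(tool_call_id)

    def allow_args(self, tool_call_id: str) -> bool:
        """Return False for deltas arriving after TOOL_CALL_END."""
        return tool_call_id not in self._ended

guard = ToolCallOrderGuard()
assert guard.allow_args("mcp_22b")       # before end: emit the delta
guard.mark_ended("mcp_22b")
assert not guard.allow_args("mcp_22b")   # after end: drop the stray delta
```

Silently dropping the delta keeps the stream protocol-valid; whether the delta should instead be merged before `TOOL_CALL_END` is emitted is a design question for the maintainers.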
Related Issues
- AG-UI: built-in tool results dropped on follow-up requests #4623 — builtin tool results dropped on follow-up AG-UI requests
- [Bug]: TOOL_CALL_START events suppressed for entire turn after tool result ag-ui-protocol/ag-ui#1275 — TOOL_CALL_START events suppressed after tool result
Neither of the above covers this exact scenario (stray post-end delta).
Environment
- Model: Internal OpenAI Responses API-compatible endpoint routing to custom inference (streaming mode); not the official OpenAI API
- Tool type: `MCPServerTool`
- Component: `AGUIEventStream` (AG-UI integration)
Minimal, Reproducible Example
Logfire Trace
No response
Python, Pydantic AI & LLM client version
- Python: latest
- Pydantic AI: latest
- LLM provider SDK: openai (Responses API, streaming mode)