Description
Feature Description
The documentation for OpenAI Assistants (https://sdk.vercel.ai/docs/ai-sdk-ui/openai-assistants) could use some additional tweaks to account for real-world usage.
In this block:

const tool_outputs =
  runResult.required_action.submit_tool_outputs.tool_calls.map(
    (toolCall: any) => {
      const parameters = JSON.parse(toolCall.function.arguments);

      switch (toolCall.function.name) {
        // configure your tool calls here

        default:
          throw new Error(
            `Unknown tool call function: ${toolCall.function.name}`,
          );
      }
    },
  );
This should really be a map over an async callback, since in practice tool calls almost always reach out to an external API. I wrap the map in a Promise.all, await it, and pass the resolved outputs along, like this:
const toolOutputs = await Promise.all(
  runResult.required_action.submit_tool_outputs.tool_calls.map(
    async (toolCall: any) => {
      // Ensure this callback is async
      const parameters = JSON.parse(toolCall.function.arguments);

      switch (toolCall.function.name) {
        case "my_function": {
          return {
            tool_call_id: toolCall.id,
            output: JSON.stringify({ data: datastuff }),
          } as RunSubmitToolOutputsParams.ToolOutput;
        }

        default:
          throw new Error(
            `Unknown tool call function: ${toolCall.function.name}`,
          );
      }
    },
  ),
);

runResult = await forwardStream(
  openai.beta.threads.runs.submitToolOutputsStream(
    threadId,
    runResult.id,
    { tool_outputs: toolOutputs }, // Pass the resolved array of tool outputs here
  ),
);
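For illustration, here is a minimal sketch of the kind of async handler one of those switch branches might call when it hits an external API. The function name, endpoint URL, and query parameter are placeholders, and the exact import path for RunSubmitToolOutputsParams may differ depending on the openai package version.

// Exact path may vary by openai SDK version
import type { RunSubmitToolOutputsParams } from "openai/resources/beta/threads/runs/runs";

// Hypothetical handler for a single tool call that reaches an external API.
// "my_function", the endpoint URL, and the query parameter are placeholders.
async function handleMyFunction(
  toolCall: any,
): Promise<RunSubmitToolOutputsParams.ToolOutput> {
  const parameters = JSON.parse(toolCall.function.arguments);

  // This await is why the map callback needs to be async.
  const response = await fetch(
    `https://api.example.com/lookup?query=${encodeURIComponent(parameters.query)}`,
  );
  const data = await response.json();

  return {
    tool_call_id: toolCall.id,
    output: JSON.stringify(data),
  };
}

Each branch of the switch can then simply return handleMyFunction(toolCall), and the surrounding Promise.all resolves all the calls concurrently before they are submitted.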
And for the status:
const { status, ... } = useAssistant()

return (
  <>
    {status === "in_progress" ? <Spinner /> : null}
  </>
)
This effectively hides streaming in the UI, because status stays "in_progress" for the entire time the assistant message is streaming in. I had to do something like this instead:
useEffect(() => {
  const lastMessage = messages[messages.length - 1];

  setInProgress(
    !(status === "awaiting_message" || lastMessage?.role === "assistant"),
  );
}, [status, messages]);
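For completeness, here is roughly how that fits together in a component. This is a sketch under a few assumptions: inProgress is local state from useState, Spinner is a placeholder component, the assistant route lives at /api/assistant as in the docs, and the useAssistant import path may be "ai/react" or "@ai-sdk/react" depending on the SDK version.

"use client";

import { useEffect, useState } from "react";
import { useAssistant } from "ai/react"; // may be "@ai-sdk/react" in newer versions

export default function Chat() {
  const { status, messages } = useAssistant({ api: "/api/assistant" });
  const [inProgress, setInProgress] = useState(false);

  useEffect(() => {
    const lastMessage = messages[messages.length - 1];
    // Hide the spinner as soon as the assistant starts streaming its reply,
    // not only once the run reaches "awaiting_message".
    setInProgress(
      !(status === "awaiting_message" || lastMessage?.role === "assistant"),
    );
  }, [status, messages]);

  return (
    <>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      {/* Spinner is a placeholder component */}
      {inProgress ? <Spinner /> : null}
    </>
  );
}

This keeps the streamed assistant message visible and only shows the spinner in the gap before any assistant text arrives.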
Use Case
No response
Additional context
No response