When the OpenAI Responses API came out this summer with support for MCP servers, I initially looked at leveraging the AI SDK for this, but it only supported creating an experimental MCP client for each remote MCP server you wanted to use, which meant generating n MCP clients for n MCP servers and then awaiting each of them to get its tools.
Because of that, I went with the native OpenAI Responses API call (#8):
https://github.com/pomerium/mcp-app-demo/blob/main/src/routes/api/chat.ts#L147-L161
```ts
answer = await client.responses.create({
  instructions: systemPrompt,
  model,
  tools,
  input: conversationHistory,
  stream: true,
  user: userId,
  ...(model.startsWith('o3') || model.startsWith('o4')
    ? {
        reasoning: {
          summary: 'detailed',
        },
      }
    : {}),
})
```
It looks like the AI SDK supports this now (see vercel/ai#10026), so let's move back to the AI SDK for this.
We just need to verify it supports the `user` field; if not, I could open an issue for that or PR it up.
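For the migration, the main thing to carry over is the per-model reasoning conditional and the `user` field from the native call. A minimal sketch of that option-building logic (the helper name and `ResponseOptions` type are illustrative, not from the repo; with the AI SDK these would presumably flow through provider options, which is the open question above):

```typescript
// Hypothetical helper mirroring the native responses.create() call:
// reasoning summaries are only requested for o3/o4 reasoning models.
type ResponseOptions = {
  user: string
  reasoning?: { summary: 'detailed' }
}

function buildResponseOptions(model: string, userId: string): ResponseOptions {
  const opts: ResponseOptions = { user: userId }
  if (model.startsWith('o3') || model.startsWith('o4')) {
    opts.reasoning = { summary: 'detailed' }
  }
  return opts
}

// Reasoning models get the summary option; others get only `user`.
console.log(buildResponseOptions('o3-mini', 'user-123'))
console.log(buildResponseOptions('gpt-4o', 'user-123'))
```

If the AI SDK's OpenAI provider doesn't accept `user` directly, something like this could be spread into its provider-specific options instead, assuming they are forwarded to the Responses API unchanged.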