
Commit 23a9937

fix: preserve functionCall.id in Gemini conversation history conversion (#196)
## Summary

One-line fix: preserve the original `functionCall.id` from Gemini conversation history instead of generating new IDs.

## Root Cause

`gemini.ts` line 157 generated `call_gemini_${name}_${i}` IDs, discarding the original fixture-assigned IDs (e.g. `call_fp_get_weather_001`). On follow-up turns, `toolCallId`-based fixtures could not match because the ID was overwritten. Requests fell through to `userMessage` fixtures, which returned another tool call, creating an infinite loop. This affects all Gemini/ADK showcase integrations. LangGraph-python (OpenAI format) was unaffected because it preserves IDs natively.

## Test plan

- [x] Red-green test: verify `functionCall.id` is preserved in conversion
- [x] Full test suite passes
- [x] Format + lint clean
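The core of the fix is a single nullish-coalescing fallback. Below is a minimal, self-contained sketch of the fixed mapping; the `FunctionCallPart` type and `toToolCalls` helper are simplified for illustration (the real `GeminiPart` in `gemini.ts` has more fields), and the `JSON.stringify` of `args` follows the usual OpenAI tool-call shape rather than being copied from the module.

```typescript
// Simplified stand-in for the relevant slice of GeminiPart in gemini.ts.
interface FunctionCallPart {
  functionCall: { name: string; args: Record<string, unknown>; id?: string };
}

// Sketch of the fixed mapping: the fixture-assigned id wins, and the
// synthesized `call_gemini_*` form is only a fallback for id-less calls.
function toToolCalls(funcCalls: FunctionCallPart[]) {
  return funcCalls.map((fc, i) => ({
    id: fc.functionCall.id ?? `call_gemini_${fc.functionCall.name}_${i}`,
    type: "function" as const,
    function: {
      name: fc.functionCall.name,
      arguments: JSON.stringify(fc.functionCall.args),
    },
  }));
}

const calls = toToolCalls([
  // Has a fixture-assigned id: it must survive the conversion.
  { functionCall: { name: "get_weather", args: { location: "Tokyo" }, id: "call_fp_get_weather_001" } },
  // No id: falls back to the old synthesized scheme.
  { functionCall: { name: "get_time", args: {} } },
]);
// calls[0].id is "call_fp_get_weather_001"; calls[1].id is "call_gemini_get_time_1"
```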
2 parents 495b2df + 679b9cb commit 23a9937

6 files changed: 56 additions & 6 deletions

.claude-plugin/marketplace.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -9,7 +9,7 @@
     "source": {
       "source": "npm",
       "package": "@copilotkit/aimock",
-      "version": "^1.23.0"
+      "version": "^1.23.1"
     },
     "description": "Fixture authoring skill for @copilotkit/aimock — LLM, multimedia (image/TTS/transcription/video), MCP, A2A, AG-UI, vector, embeddings, structured output, sequential responses, streaming physics, record/replay, agent loop patterns, and debugging"
   }
```

.claude-plugin/plugin.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 {
   "name": "aimock",
-  "version": "1.23.0",
+  "version": "1.23.1",
   "description": "Fixture authoring guidance for @copilotkit/aimock — LLM, multimedia, MCP, A2A, AG-UI, vector, and service mocking",
   "author": {
     "name": "CopilotKit"
```

CHANGELOG.md

Lines changed: 6 additions & 0 deletions
```diff
@@ -2,6 +2,12 @@
 
 ## [Unreleased]
 
+## [1.23.1] - 2026-05-14
+
+### Fixed
+
+- **Gemini functionCall.id preservation** — the Gemini conversation history converter generated new tool call IDs (`call_gemini_*`) instead of preserving the original IDs from `functionCall.id`. This broke `toolCallId`-based fixture matching on follow-up turns: the follow-up fixture couldn't match because the ID was overwritten, so the request fell through to `userMessage` fixtures which returned another tool call — creating an infinite loop for all Gemini/ADK showcase integrations. LangGraph-python (OpenAI format) was unaffected because it preserves IDs natively. ([#196](https://github.com/CopilotKit/aimock/pull/196))
+
 ## [1.23.0] - 2026-05-13
 
 ### Added
```

package.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 {
   "name": "@copilotkit/aimock",
-  "version": "1.23.0",
+  "version": "1.23.1",
   "description": "Mock infrastructure for AI application testing — LLM APIs, image generation, text-to-speech, transcription, audio generation, video generation, MCP tools, A2A agents, AG-UI event streams, vector databases, search, rerank, and moderation. One package, one port, zero dependencies.",
   "license": "MIT",
   "keywords": [
```

src/__tests__/gemini.test.ts

Lines changed: 44 additions & 0 deletions
```diff
@@ -379,6 +379,50 @@ describe("geminiToCompletionRequest", () => {
     expect(result.messages[0].tool_call_id).not.toBe(result.messages[1].tool_call_id);
   });
 
+  it("preserves functionCall.id from Gemini conversation history", () => {
+    const result = geminiToCompletionRequest(
+      {
+        contents: [
+          { role: "user", parts: [{ text: "weather in Tokyo" }] },
+          {
+            role: "model",
+            parts: [
+              {
+                functionCall: {
+                  name: "get_weather",
+                  args: { location: "Tokyo" },
+                  id: "call_fp_get_weather_001",
+                },
+              },
+            ],
+          },
+          {
+            role: "user",
+            parts: [
+              {
+                functionResponse: {
+                  name: "get_weather",
+                  response: { temperature: 72 },
+                },
+              },
+            ],
+          },
+          { role: "user", parts: [{ text: "thanks" }] },
+        ],
+      },
+      "gemini-2.0-flash",
+      false,
+    );
+
+    const assistantMsg = result.messages.find((m) => m.role === "assistant" && m.tool_calls);
+    expect(assistantMsg).toBeDefined();
+    expect(assistantMsg!.tool_calls![0].id).toBe("call_fp_get_weather_001");
+
+    const toolMsg = result.messages.find((m) => m.role === "tool");
+    expect(toolMsg).toBeDefined();
+    expect(toolMsg!.tool_call_id).toBe("call_fp_get_weather_001");
+  });
+
   it("aligns functionCall and functionResponse IDs across a round trip", () => {
     // Model turn: two functionCall parts
     const modelTurn = geminiToCompletionRequest(
```

src/gemini.ts

Lines changed: 3 additions & 3 deletions
```diff
@@ -46,8 +46,8 @@ import { proxyAndRecord } from "./recorder.js";
 interface GeminiPart {
   text?: string;
   thought?: boolean;
-  functionCall?: { name: string; args: Record<string, unknown> };
-  functionResponse?: { name: string; response: unknown };
+  functionCall?: { name: string; args: Record<string, unknown>; id?: string };
+  functionResponse?: { name: string; response: unknown; id?: string };
   inlineData?: { mimeType: string; data: string };
 }
 
@@ -154,7 +154,7 @@ export function geminiToCompletionRequest(
         role: "assistant",
         content: text || null,
         tool_calls: funcCalls.map((fc, i) => ({
-          id: `call_gemini_${fc.functionCall!.name}_${i}`,
+          id: fc.functionCall!.id ?? `call_gemini_${fc.functionCall!.name}_${i}`,
           type: "function" as const,
           function: {
             name: fc.functionCall!.name,
```
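To see why the one-line change matters, consider a stripped-down sketch of `toolCallId`-based fixture matching. The `Fixture` shape and `matchFixture` helper below are hypothetical illustrations of the mechanism described in the commit message, not the real aimock API:

```typescript
// Hypothetical fixture shapes, for illustration only.
interface Fixture {
  toolCallId?: string;   // matches a follow-up turn carrying this tool result
  userMessage?: string;  // fallback matching on the user's text
  response: string;
}

interface ToolMessage { role: "tool"; tool_call_id: string }

// Illustrative matcher: toolCallId fixtures are tried first, then
// the request falls through to userMessage fixtures.
function matchFixture(fixtures: Fixture[], toolMsg: ToolMessage, userText: string): Fixture | undefined {
  return (
    fixtures.find((f) => f.toolCallId === toolMsg.tool_call_id) ??
    fixtures.find((f) => f.userMessage === userText)
  );
}

const fixtures: Fixture[] = [
  { toolCallId: "call_fp_get_weather_001", response: "It is 72F in Tokyo." },
  { userMessage: "weather in Tokyo", response: "<another tool call>" },
];

// With the original id preserved, the follow-up fixture matches.
const hit = matchFixture(fixtures, { role: "tool", tool_call_id: "call_fp_get_weather_001" }, "weather in Tokyo");

// With a regenerated id it falls through to the userMessage fixture, which
// returns another tool call: the infinite loop described in the root cause.
const miss = matchFixture(fixtures, { role: "tool", tool_call_id: "call_gemini_get_weather_0" }, "weather in Tokyo");
```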
