
Conversation


@Dwij1704 Dwij1704 commented Dec 12, 2025

Note

Introduces a new package that instruments the @google/genai SDK with OpenTelemetry/OpenInference spans, plus helper API, examples, and tests.

  • New Package: js/packages/openinference-instrumentation-google-genai
    • Provides OpenTelemetry/OpenInference instrumentation for @google/genai.
    • Helper: createInstrumentedGoogleGenAI to create and instrument instances.
  • Instrumentation:
    • Wraps ai.models.generateContent, generateContentStream, generateImages.
    • Wraps ai.chats.create and chat methods sendMessage, sendMessageStream.
    • Wraps ai.batches.createEmbeddings.
    • Captures OpenInference attributes (model, messages, tools, token usage; LLM system vertexai, LLM provider google).
    • Supports masking via traceConfig, context propagation, and tracing suppression.
  • Docs & Examples:
    • Added README.md, CHANGELOG.md, and example apps (chat.ts, streaming.ts, chat-session.ts, tools.ts, embeddings.ts, instrumentation.ts).
  • Testing & Build:
    • Vitest tests for spans and suppression; added tsconfigs and package scripts; lockfile updated.

Written by Cursor Bugbot for commit a8697a4. This will update automatically on new commits.
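The wrap-and-trace pattern the summary describes (a span opened around each SDK call and ended when it settles) can be sketched without the real SDK or OpenTelemetry. Everything here is an illustrative stand-in: `Span`, `tracer`, `fakeModels`, and `wrapMethod` are not the actual APIs of this package.

```typescript
// Minimal stand-in for a tracer that hands out spans.
type Span = { name: string; ended: boolean; end(): void };

const tracer = {
  startSpan(name: string): Span {
    return {
      name,
      ended: false,
      end() {
        this.ended = true;
      },
    };
  },
};

// Stand-in for ai.models.generateContent on the wrapped client.
const fakeModels = {
  async generateContent(prompt: string): Promise<string> {
    return `response to: ${prompt}`;
  },
};

// Wrap an async method so a span covers the full call and is ended
// whether the promise resolves or rejects.
function wrapMethod<A extends unknown[], R>(
  name: string,
  fn: (...args: A) => Promise<R>,
): (...args: A) => Promise<R> {
  return async (...args: A): Promise<R> => {
    const span = tracer.startSpan(name);
    try {
      return await fn(...args);
    } finally {
      span.end(); // runs on success and on error
    }
  };
}

const tracedGenerateContent = wrapMethod(
  "generateContent",
  fakeModels.generateContent.bind(fakeModels),
);
```

The real instrumentation additionally records OpenInference attributes on the span before ending it; this sketch only shows the lifecycle guarantee.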


pkg-pr-new bot commented Dec 12, 2025


@arizeai/openinference-core

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-core@2515

@arizeai/openinference-genai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-genai@2515

@arizeai/openinference-instrumentation-anthropic

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-anthropic@2515

@arizeai/openinference-instrumentation-bedrock

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock@2515

@arizeai/openinference-instrumentation-bedrock-agent-runtime

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock-agent-runtime@2515

@arizeai/openinference-instrumentation-beeai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-beeai@2515

@arizeai/openinference-instrumentation-google-genai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-google-genai@2515

@arizeai/openinference-instrumentation-langchain

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain@2515

@arizeai/openinference-instrumentation-langchain-v0

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain-v0@2515

@arizeai/openinference-instrumentation-mcp

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-mcp@2515

@arizeai/openinference-instrumentation-openai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-openai@2515

@arizeai/openinference-mastra

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@2515

@arizeai/openinference-semantic-conventions

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-semantic-conventions@2515

@arizeai/openinference-vercel

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-vercel@2515

commit: a8697a4

@Dwij1704 Dwij1704 marked this pull request as ready for review December 12, 2025 03:33
@Dwij1704 Dwij1704 requested a review from a team as a code owner December 12, 2025 03:33
@dosubot dosubot bot added the size:XXL This PR changes 1000+ lines, ignoring generated files. label Dec 12, 2025
yield chunk;
}
})();
});

Bug: Streaming methods leave spans open on error

The generateContentStream and sendMessageStream methods lack error handling for failures during stream consumption. The for await loop that iterates over stream chunks has no try/catch or .catch() handler. If an error occurs mid-stream (network failure, API error), the span is never ended, causing a resource leak. Unlike generateContent, which has a .catch() handler, these streaming methods only use .then(), so any error thrown during chunk iteration leaves the span open indefinitely.

Additional Locations (1)

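One way to close this gap is to end the span in a finally block inside the wrapping async generator, so the span is ended on normal completion, on a mid-stream error, and on early consumer return alike. This is a sketch, not the package's actual fix; `Span` here is a stand-in for the real OpenTelemetry interface.

```typescript
// Illustrative span shape, not the real OpenTelemetry Span.
type Span = { ended: boolean; end(): void };

// Wrap a stream so the span's lifetime matches the iteration exactly.
async function* wrapStream<T>(
  span: Span,
  stream: AsyncIterable<T>,
): AsyncGenerator<T> {
  try {
    for await (const chunk of stream) {
      yield chunk; // forward each chunk to the caller
    }
  } finally {
    // Runs on normal completion, on a thrown error, and when the
    // consumer breaks out of its for-await loop early.
    span.end();
  }
}
```

In real OpenTelemetry code the catch path would also record the exception and set an error status on the span before it ends.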

yield chunk;
}
})();
});

Bug: Streaming methods buffer all chunks before yielding

The generateContentStream and sendMessageStream methods fully consume the original stream into a buffer before the promise resolves and yields any chunks back to the caller. When a user awaits these methods, the call does not resolve until every chunk has been received from the API; the returned generator then yields from the buffer instantly. This defeats the purpose of streaming APIs, where users expect progressive output for real-time display. Users will see nothing until the entire response is complete, then receive all chunks at once.

Additional Locations (1)

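A pass-through wrapper avoids this: yield each chunk to the caller as it arrives, while keeping a copy so span attributes can be recorded once the stream ends. A minimal sketch, with `onDone` standing in for "set output attributes and end the span" (the names are illustrative, not the package's API):

```typescript
// Forward chunks immediately; only the bookkeeping is deferred to the end.
async function* passThrough<T>(
  stream: AsyncIterable<T>,
  onDone: (chunks: T[]) => void,
): AsyncGenerator<T> {
  const seen: T[] = [];
  for await (const chunk of stream) {
    seen.push(chunk); // keep a copy for attribute extraction
    yield chunk;      // caller receives the chunk right away, no buffering delay
  }
  onDone(seen); // e.g. record token usage and output, then end the span
}
```

Because the generator yields inside the for-await loop rather than after it, the first chunk reaches the consumer before the API has finished responding, preserving progressive output.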




2 participants