What problem do you want to solve?
Add support for collecting additional token usage metadata from Google GenAI responses:
- `usage_metadata.thoughts_token_count` (Gemini 2.5 thinking tokens)
- `usage_metadata.cached_content_token_count`
Describe the solution you'd like
- Collect `thoughts_token_count` as `gcp.gen_ai.usage.thoughts_token_count`, using the GCP vendor prefix since this attribute is not in the OTel standard
- Collect `cached_content_token_count` as `gen_ai.usage.cache_read.input_tokens`, the existing OTel standard attribute
Describe alternatives you've considered
No response
Additional Context
Motivation
- Gemini 2.5 models expose thinking tokens for extended reasoning capabilities
- Cached tokens are important for cost tracking and performance analysis
- Users need visibility into all token types for accurate billing and monitoring
Current State
Currently, only `prompt_token_count` and `candidates_token_count` are collected:
https://github.com/open-telemetry/opentelemetry-python-contrib/blob/main/instrumentation-genai/opentelemetry-instrumentation-google-genai/src/opentelemetry/instrumentation/google_genai/generate_content.py#L425-L435
I'm willing to submit a PR for this feature. 👍🏻
Would you like to implement a fix?
Yes