** Please make sure you read the contribution guide and file the issue in the right place. **
Contribution guide.
Describe the bug
When using streaming mode, the Gemini model does not return the usageMetadata in the LlmResponse. When stream=true, the usageMetadata should be returned in the final aggregated response.
To Reproduce
Steps to reproduce the behavior:
- Create an LlmAgent with the Gemini model.
- In the RunConfig, set the streaming mode to RunConfig.StreamingMode.SSE.
- Observe the LlmResponse in afterModelCallback of a plugin or agent.
- See that for both partial and final LLM responses, usageMetadata is Optional.empty.
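A minimal repro sketch of the steps above, assuming the ADK Java quickstart API (LlmAgent.builder, InMemoryRunner, RunConfig.builder) and the Maybe-based Callbacks.AfterModelCallback signature; the agent name, model id, and prompt are placeholders:

```java
import com.google.adk.agents.Callbacks;
import com.google.adk.agents.LlmAgent;
import com.google.adk.agents.RunConfig;
import com.google.adk.runner.InMemoryRunner;
import com.google.genai.types.Content;
import com.google.genai.types.Part;
import io.reactivex.rxjava3.core.Maybe;

public class UsageMetadataRepro {

  public static void main(String[] args) {
    // Log usageMetadata for every model response; with SSE streaming this
    // prints Optional.empty for both partial and final responses.
    Callbacks.AfterModelCallback logUsage =
        (callbackContext, llmResponse) -> {
          System.out.println(
              "partial=" + llmResponse.partial()
                  + " usageMetadata=" + llmResponse.usageMetadata());
          return Maybe.empty(); // leave the response unchanged
        };

    LlmAgent agent =
        LlmAgent.builder()
            .name("usage_repro_agent")
            .model("gemini-2.0-flash")
            .instruction("Answer briefly.")
            .afterModelCallback(logUsage)
            .build();

    RunConfig runConfig =
        RunConfig.builder().setStreamingMode(RunConfig.StreamingMode.SSE).build();

    InMemoryRunner runner = new InMemoryRunner(agent);
    var session =
        runner.sessionService().createSession(runner.appName(), "user").blockingGet();

    Content message = Content.fromParts(Part.fromText("Hello"));
    runner
        .runAsync(session.userId(), session.id(), message, runConfig)
        .blockingForEach(event -> System.out.println("event: " + event.stringifyContent()));
  }
}
```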
Expected behavior
The final (non-partial) LlmResponse should include the usageMetadata.
Desktop (please complete the following information):
- OS: macOS
- Java version:
- ADK version (see Maven dependency): 0.5.0
Additional context
The Python version of ADK returns the usageMetadata correctly.
The usageMetadata is returned from the Gemini API in the last chunk.
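For reference, a sketch that calls the google-genai Java SDK streaming API directly (the client the Gemini model wraps); the model id and prompt are placeholders, and the exact method shapes are assumptions based on the SDK README. It illustrates the usage metadata arriving on the last streamed chunk, which the ADK aggregation appears to drop:

```java
import com.google.genai.Client;
import com.google.genai.ResponseStream;
import com.google.genai.types.GenerateContentResponse;

public class StreamUsageCheck {

  public static void main(String[] args) throws Exception {
    // Reads GOOGLE_API_KEY (or Vertex AI settings) from the environment.
    Client client = new Client();

    ResponseStream<GenerateContentResponse> stream =
        client.models.generateContentStream("gemini-2.0-flash", "Say hello.", null);

    for (GenerateContentResponse chunk : stream) {
      // Intermediate chunks report empty usageMetadata; the final chunk
      // carries the aggregated token counts.
      System.out.println(
          "text=" + chunk.text() + " usageMetadata=" + chunk.usageMetadata());
    }
  }
}
```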