refactor: Google Gen AI LLM event refactor #3748
Conversation
Force-pushed from fb55422 to 1f389c5.
Codecov Report ✅ All modified and coverable lines are covered by tests.
Additional details and impacted files

@@            Coverage Diff             @@
##             main    #3748      +/-   ##
==========================================
- Coverage   90.11%   81.42%    -8.70%
==========================================
  Files         450      440       -10
  Lines       58464    57233     -1231
  Branches        1        1
==========================================
- Hits        52687    46603     -6084
- Misses       5777    10630     +4853
There's no need to assert that response is defined when the function requires it as a parameter. I'd also avoid the cost of "is metadata defined" by:
const { usageMetadata } = response
if (Object.prototype.toString.call(usageMetadata) !== '[object Object]') {
  return { promptTokens: 0, completionTokens: 0, totalTokens: 0 }
}

  request,
  response,
- withError: !!err
+ error: !!err
err is an object; updated the doc block to reflect that.
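For illustration, a minimal sketch of how the suggested guard could sit inside a token-usage helper. The helper name extractTokenCounts and the usageMetadata field names (promptTokenCount, candidatesTokenCount, totalTokenCount) are assumptions based on the Google Gen AI response shape, not code from this PR:

'use strict'

// Destructure usageMetadata and bail out with zeroed counts unless it is a
// plain object, per the review suggestion above.
function extractTokenCounts(response) {
  const { usageMetadata } = response
  if (Object.prototype.toString.call(usageMetadata) !== '[object Object]') {
    return { promptTokens: 0, completionTokens: 0, totalTokens: 0 }
  }

  return {
    promptTokens: usageMetadata.promptTokenCount ?? 0,
    completionTokens: usageMetadata.candidatesTokenCount ?? 0,
    totalTokens: usageMetadata.totalTokenCount ?? 0
  }
}

// Example: a response without usageMetadata yields all zeros.
console.log(extractTokenCounts({ usageMetadata: { promptTokenCount: 12, candidatesTokenCount: 30, totalTokenCount: 42 } }))
console.log(extractTokenCounts({}))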
The function signature and docblock indicate that response will always be defined. Is that correct?
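For context, a hedged sketch of the kind of doc block being discussed: response documented as always defined, err documented as an optional Error object that is reduced to a boolean flag. The function name and surrounding shape are hypothetical, not taken from the PR:

/**
 * Records a chat completion event for a Google Gen AI call.
 *
 * @param {object} params
 * @param {object} params.request - the outgoing generateContent request (always defined)
 * @param {object} params.response - the API response (always defined per the signature)
 * @param {Error} [params.err] - error thrown by the call, if any
 */
function recordChatCompletionEvent({ request, response, err }) {
  // the event carries only a boolean error flag, hence error: !!err
  return { request, response, error: !!err }
}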
Description
Refactors Google Gen AI LLM events to use the new base LlmEmbedding, LlmChatCompletionMessage, and LlmChatCompletionSummary classes.
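For context, a self-contained sketch of the event shapes the refactor moves toward. Only the three class names come from this PR; every constructor option, property name, and the recordChatCompletion hook below are illustrative assumptions, not the agent's actual implementation:

'use strict'

const { randomUUID } = require('node:crypto')

// Illustrative stand-ins for the shared base event classes.
class LlmChatCompletionSummary {
  constructor({ request, response, error }) {
    this.id = randomUUID()
    this.requestModel = request?.model
    this.responseModel = response?.modelVersion ?? request?.model
    this.error = Boolean(error)
  }
}

class LlmChatCompletionMessage {
  constructor({ content, role, sequence, completionId }) {
    this.content = content
    this.role = role
    this.sequence = sequence
    this.completionId = completionId
  }
}

class LlmEmbedding {
  constructor({ request, error }) {
    this.requestModel = request?.model
    this.error = Boolean(error)
  }
}

// Rough shape of an instrumentation hook after a generateContent call:
// one summary event plus one message event per returned candidate.
function recordChatCompletion({ request, response, err }) {
  const summary = new LlmChatCompletionSummary({ request, response, error: !!err })
  const messages = (response?.candidates ?? []).map((candidate, sequence) =>
    new LlmChatCompletionMessage({
      content: candidate?.content?.parts?.map((part) => part.text).join(''),
      role: candidate?.content?.role,
      sequence,
      completionId: summary.id
    })
  )
  return { summary, messages }
}

module.exports = { LlmEmbedding, LlmChatCompletionMessage, LlmChatCompletionSummary, recordChatCompletion }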
How to Test
Related Issues
Part of #3687