Add model property to TokenUsage for cost calculation #1566
base: main
Conversation
@copilot can you update the examples
Add changelog entries to components @copilot
@copilot it's for 0.4 for all components
0.3
---
* Add `model` property to `TokenUsage` and `TokenUsageInterface` to track which AI model was used for token consumption

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.5
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.2
---
* Add model information to token usage extraction

@copilot for 0.4 section
0.3
---
* Add model information to token usage extraction

@copilot for 0.4 section
@copilot `$tokenUsage = $response->getMetadata()->get('token_usage');` returns a `TokenUsageAggregation` object, which contains a private array of `TokenUsage` objects. Each `TokenUsage` object should have a model property. There is also an issue with iterating over the `TokenUsage` objects, since the array is private.
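For illustration only, a minimal sketch of how the aggregation could expose its private array for iteration; the constructor shape and the `IteratorAggregate` approach are assumptions, not the repository's actual implementation:

```php
<?php

// Sketch only: the TokenUsageAggregation described above, made iterable so
// callers can reach each TokenUsage (and its model) without accessing the
// private array directly. The constructor shape is assumed for illustration.
final class TokenUsageAggregation implements \IteratorAggregate
{
    /**
     * @param TokenUsage[] $usages
     */
    public function __construct(
        private array $usages = [],
    ) {
    }

    /**
     * @return \Traversable<TokenUsage>
     */
    public function getIterator(): \Traversable
    {
        yield from $this->usages;
    }
}
```

With something along these lines, `foreach ($response->getMetadata()->get('token_usage') as $usage)` would yield the individual `TokenUsage` objects and their `getModel()` values.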
TokenUsage lacked model information, making it impossible to calculate costs accurately when using multiple models with different pricing (e.g., GPT-4o vs embeddings).
Changes
* Add `model` property and `getModel()` method to `TokenUsage` and `TokenUsageInterface`
* Update `print_token_usage()` function in `examples/bootstrap.php` to display the model property in all token usage examples

Usage
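A hedged sketch of the per-model cost calculation this enables; the pricing table, the `$tokenUsages` variable, and `getTotalTokens()` are placeholders for illustration, and only the `model` property and `getModel()` come from this change:

```php
<?php

// Illustrative only: per-model pricing with the new getModel() method.
// Prices below are placeholder values, and getTotalTokens() stands in for
// whatever token counts the application actually tracks.
$pricePerMillionTokens = [
    'gpt-4o' => 2.50,
    'text-embedding-3-small' => 0.02,
];

/** @var iterable<TokenUsage> $tokenUsages  e.g. collected from response metadata */
$cost = 0.0;
foreach ($tokenUsages as $tokenUsage) {
    $model = $tokenUsage->getModel(); // new in this PR, nullable

    if (null === $model || !isset($pricePerMillionTokens[$model])) {
        continue; // unknown or missing model: skip it here
    }

    $cost += $tokenUsage->getTotalTokens() / 1_000_000 * $pricePerMillionTokens[$model];
}
```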
The `model` parameter is optional (nullable) for backward compatibility.