Reduce token usage when running Review Notes #434
Conversation
… that context as we already have this called out in our system instructions
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff            @@
##           develop     #434   +/-   ##
=========================================
  Coverage    68.44%   68.44%
  Complexity     846      846
=========================================
  Files           56       56
  Lines         4095     4095
=========================================
  Hits          2803     2803
  Misses        1292     1292
```
jeffpaul
left a comment
Tested Review Notes with this change and things continue to work as expected.
What?
Try to reduce the token usage when running Review Notes
Why?
It was reported that it can be easy to run into hourly token limits when running Review Notes. Some of this is unavoidable (for instance, if you have a really long post), but we do control how much extra context we send with each request, and we can reduce that somewhat while still maintaining quality results.
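Trimming the extra context sent with each request might look roughly like the sketch below. This is a hypothetical illustration, not the plugin's actual code: `trimContext`, its snippet-list input, and the character budget are all assumed names for the general technique of capping supplementary context before it is attached to a request.

```javascript
// Hypothetical sketch: cap the supplementary context attached to a
// request at a fixed character budget. Snippets are included in order
// until the budget runs out; the last one that partially fits is cut.
function trimContext(snippets, maxChars) {
  const kept = [];
  let used = 0;
  for (const snippet of snippets) {
    const remaining = maxChars - used;
    if (remaining <= 0) break;
    // Truncate the snippet if only part of it fits in the budget.
    const piece = snippet.length > remaining ? snippet.slice(0, remaining) : snippet;
    kept.push(piece);
    used += piece.length;
  }
  return kept.join('\n');
}
```

A character (or token-estimate) budget like this trades a little context fidelity for a predictable per-request cost, which is the kind of lever this PR adjusts.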
How?
In testing on a fairly short article (12 paragraphs, 5 headings), here's the before and after token usage:
Use of AI Tools
None for the code in this PR. Did use Claude Code running Opus 4.6 to build a helper plugin that logs token usage to the debug log.
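The kind of logging that helper plugin performed might be sketched as follows. This is an assumption, not the helper's actual code: the `usage` field shape follows the common OpenAI-style response convention (`prompt_tokens`, `completion_tokens`), and `logTokenUsage` is an illustrative name.

```javascript
// Sketch: record prompt/completion token counts from an API response
// so before/after comparisons can be made. The response shape assumed
// here (response.usage.prompt_tokens etc.) is the OpenAI-style
// convention, not necessarily what Review Notes receives.
function logTokenUsage(label, response) {
  const { prompt_tokens = 0, completion_tokens = 0 } = response.usage ?? {};
  const total = prompt_tokens + completion_tokens;
  console.debug(`[token-usage] ${label}: prompt=${prompt_tokens} completion=${completion_tokens} total=${total}`);
  return total;
}
```

Logging per-request totals like this is what makes the before/after comparison in the "How?" section measurable.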
Testing Instructions
npm i && npm run build

Changelog Entry