Description
Build a new plugin that allows customers to count the number of LLM tokens used per recorded session. For each recorded request, check whether it has a response and whether that response is an LLM response containing token usage information (tokens sent and received). Calculate the total number of tokens per type (prompt, completion, total), group the totals per model, and show per-type totals plus a grand total for the session.
Initially, the plugin will support Azure AI Foundry URLs for grouping per model.
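
A minimal sketch of the aggregation logic described above (Python, for illustration only; the field names `prompt_tokens`, `completion_tokens`, and `total_tokens` assume an OpenAI-style `usage` object such as the one Azure AI Foundry chat completions return, and the shape of the recorded request/response objects is hypothetical):

```python
from collections import defaultdict

def summarize_token_usage(recorded_requests):
    """Aggregate LLM token usage per model and per type for one session.

    Assumes each item is a dict with an optional "response" that may carry
    an OpenAI-style "usage" object and a "model" name.
    """
    per_model = defaultdict(lambda: {"prompt": 0, "completion": 0, "total": 0})
    grand_total = {"prompt": 0, "completion": 0, "total": 0}

    for request in recorded_requests:
        response = request.get("response")
        if not response:
            continue  # request has no recorded response
        usage = response.get("usage")
        if not usage:
            continue  # not an LLM response with token information
        model = response.get("model", "unknown")
        counts = {
            "prompt": usage.get("prompt_tokens", 0),
            "completion": usage.get("completion_tokens", 0),
            "total": usage.get("total_tokens", 0),
        }
        for token_type, count in counts.items():
            per_model[model][token_type] += count
            grand_total[token_type] += count

    return per_model, grand_total
```

The per-model breakdown and the grand total together give the session-level report described above.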
Originally proposed by @tomorgan