Area(s)
No response
What's missing?
Modern LLM APIs, such as OpenAI's Responses API and Anthropic's Messages API, now offer an option to compact a conversation. We need attributes that capture both the compaction threshold and a compaction identifier when LLM clients enable compaction on their API calls.
Reference
- https://developers.openai.com/api/docs/guides/compaction
- https://platform.claude.com/docs/en/build-with-claude/compaction
Describe the solution you'd like
Add two new span attributes, gen_ai.compaction.tokens and gen_ai.compaction.enabled, to the LLMInvocation or Inference span.
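To illustrate how an instrumentation might populate these attributes, here is a minimal sketch. The attribute names come from this proposal, but the request shape (a compaction object with a token_threshold field) is a hypothetical stand-in, not the actual OpenAI or Anthropic request schema:

```python
# Hypothetical sketch: deriving the proposed compaction span attributes from a
# client request. The "compaction" / "token_threshold" request fields are
# assumptions for illustration, not any provider's documented schema.

def compaction_span_attributes(request: dict) -> dict:
    """Return span attributes describing the request's compaction settings."""
    attributes = {}
    compaction = request.get("compaction")
    if compaction is not None:
        # Record that compaction was requested on this LLM call.
        attributes["gen_ai.compaction.enabled"] = True
        # Record the token threshold at which compaction triggers, if provided.
        if "token_threshold" in compaction:
            attributes["gen_ai.compaction.tokens"] = compaction["token_threshold"]
    return attributes

# Example: a request that enables compaction at a 100k-token threshold.
request = {
    "model": "example-model",
    "compaction": {"token_threshold": 100_000},
}
print(compaction_span_attributes(request))
```

The instrumentation would then attach the returned mapping to the LLMInvocation/Inference span at request time, so that calls without compaction emit neither attribute.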
Tip
React with 👍 to help prioritize this issue. Please use comments to provide useful context, avoiding +1 or me too, to help us triage it. Learn more here.