
Conversation


@Dwij1704 Dwij1704 commented Dec 23, 2025

Note

Introduces OpenInference instrumentation for the OpenAI Agents SDK with a processor that maps SDK traces/spans to OpenTelemetry using OpenInference semantics.

  • Adds @arizeai/openinference-instrumentation-openai-agents package exporting OpenAIAgentsInstrumentation and OpenInferenceTracingProcessor to register via sdk.addTraceProcessor() and emit AGENT/LLM/TOOL spans with attributes (messages, tools, usage, handoffs)
  • Includes examples (basic-usage, guardrails, handoffs, lifecycle-hooks, multi-turn, streaming, structured-output) and shared instrumentation.ts setup
  • Adds tests validating span creation, attributes (model, tokens, messages, tool calls), handoff graph (graph.node.id/parent_id), errors, and instrumentation wiring
  • Adds build/test config (package.json, tsconfig*.json, vitest.config.ts) and updates lockfiles
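Based on the registration flow described above, a shared instrumentation.ts setup might look like the following sketch. The Agents SDK import path (`@openai/agents`) and a top-level `addTraceProcessor` export are assumptions drawn from the PR summary, not verified against the published packages:

```typescript
// Hypothetical setup sketch (import paths assumed, per the PR description):
// register the OpenInference processor so SDK traces/spans are mapped to
// OpenTelemetry AGENT/LLM/TOOL spans with OpenInference semantics.
import { OpenInferenceTracingProcessor } from "@arizeai/openinference-instrumentation-openai-agents";
import { addTraceProcessor } from "@openai/agents"; // assumed SDK entry point

addTraceProcessor(new OpenInferenceTracingProcessor());
```

Any agents run after this registration would then emit spans carrying the message, tool, usage, and handoff attributes listed above.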

Written by Cursor Bugbot for commit af3d98d. This will update automatically on new commits.

@dosubot dosubot bot added the size:XXL This PR changes 1000+ lines, ignoring generated files. label Dec 23, 2025
  }

  this._enabled = true;
}

State flag prevents second instrumentation method call

The createProcessor() method sets _enabled = true, which causes instrument() to return early. This prevents using createProcessor() followed by manual registration, defeating the purpose of having two separate APIs. The _enabled flag should only be set when actually registering with the SDK via instrument(), not when merely creating a processor instance.
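A minimal sketch of the suggested fix, with names mirroring the PR but all implementation details assumed: `_enabled` is flipped only when `instrument()` actually registers with the SDK, so creating a processor for manual registration no longer blocks a later `instrument()` call.

```typescript
// Hypothetical shapes for illustration; the real classes live in the PR.
class Processor {}

class Instrumentation {
  private _enabled = false;

  // Creates a processor WITHOUT marking the instrumentation as enabled,
  // leaving manual registration via sdk.addTraceProcessor() possible.
  createProcessor(): Processor {
    return new Processor();
  }

  // Registers with the SDK (the register callback stands in for the SDK
  // call here) and only then sets the state flag.
  instrument(register: (p: Processor) => void): boolean {
    if (this._enabled) return false; // already registered; no-op
    register(this.createProcessor());
    this._enabled = true;
    return true;
  }
}

const registered: Processor[] = [];
const inst = new Instrumentation();
inst.createProcessor(); // manual path: does NOT block instrument()
console.log(inst.instrument((p) => registered.push(p))); // true
console.log(inst.instrument((p) => registered.push(p))); // false
console.log(registered.length); // 1
```

With this split, `_enabled` guards only double registration, which is the one case the flag exists to prevent.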



pkg-pr-new bot commented Dec 23, 2025


@arizeai/openinference-core

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-core@2564

@arizeai/openinference-genai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-genai@2564

@arizeai/openinference-instrumentation-anthropic

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-anthropic@2564

@arizeai/openinference-instrumentation-bedrock

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock@2564

@arizeai/openinference-instrumentation-bedrock-agent-runtime

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock-agent-runtime@2564

@arizeai/openinference-instrumentation-beeai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-beeai@2564

@arizeai/openinference-instrumentation-langchain

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain@2564

@arizeai/openinference-instrumentation-langchain-v0

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain-v0@2564

@arizeai/openinference-instrumentation-mcp

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-mcp@2564

@arizeai/openinference-instrumentation-openai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-openai@2564

@arizeai/openinference-instrumentation-openai-agents

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-openai-agents@2564

@arizeai/openinference-mastra

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@2564

@arizeai/openinference-semantic-conventions

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-semantic-conventions@2564

@arizeai/openinference-vercel

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-vercel@2564

commit: af3d98d

@Dwij1704 Dwij1704 changed the title feat: OpenAI agents sdk instrumentation Dec 23, 2025

Off-by-one in handoff tracking memory cap

The reverseHandoffsDict size limit check runs after a new entry is added and removes only one entry when the limit is exceeded. This allows the map to grow to MAX_HANDOFFS_IN_FLIGHT + 1 before cleanup, and when handoffs arrive faster than agent spans consume them, removing one entry per insertion is not enough to keep the map bounded. The check should run before adding, or the cleanup should remove enough entries to stay at or under the limit.
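A hedged sketch of the suggested cleanup, reusing the names from the review (`reverseHandoffsDict`, `MAX_HANDOFFS_IN_FLIGHT`) but with all other details assumed: evict the oldest entries before inserting, so the map never exceeds the cap even under bursty handoff traffic.

```typescript
// Small cap for illustration only; the real constant is defined in the PR.
const MAX_HANDOFFS_IN_FLIGHT = 3;

// JS Maps iterate in insertion order, so keys().next() yields the oldest entry.
const reverseHandoffsDict = new Map<string, string>();

function trackHandoff(toAgent: string, fromAgent: string): void {
  // Evict oldest-first until there is room, BEFORE adding the new entry,
  // so size never exceeds MAX_HANDOFFS_IN_FLIGHT.
  while (reverseHandoffsDict.size >= MAX_HANDOFFS_IN_FLIGHT) {
    const oldest = reverseHandoffsDict.keys().next().value;
    if (oldest === undefined) break;
    reverseHandoffsDict.delete(oldest);
  }
  reverseHandoffsDict.set(toAgent, fromAgent);
}

// Simulate handoffs arriving faster than agent spans consume them.
for (let i = 0; i < 10; i++) trackHandoff(`agent-${i}`, "triage");
console.log(reverseHandoffsDict.size); // 3 — bounded at the cap
```

Evicting in a loop rather than once also keeps the invariant if the cap is ever lowered at runtime while entries are in flight.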


