feat: wrappers for async responses methods. #4325

Open

eternalcuriouslearner wants to merge 6 commits into open-telemetry:main from eternalcuriouslearner:feat/wrappers-for-async-responses-methods

Conversation

@eternalcuriouslearner
Description

Add async OpenAI Responses API stream wrappers for AsyncResponseStream and AsyncResponseStreamManager to the OpenAI v2 instrumentation, mirroring the existing sync wrapper behavior for telemetry finalization, failure handling, and wrapped-response cleanup. This also extends the response-wrapper unit tests to cover async manager entry/exit behavior and the async stream __aexit__, close(), until_done(), response.aclose(), and get_final_response() flows.

This closes the parity gap between the sync and async Responses streaming wrapper implementations so the async SDK path can be instrumented with the same cleanup and error-handling guarantees. No new runtime dependencies were added; the change uses the existing OpenAI instrumentation package and test dependencies.
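To make the intended behavior concrete, here is a minimal sketch of an async stream wrapper that finalizes telemetry on exit and records failures. All names here (AsyncStreamWrapper, the span interface) are illustrative stand-ins, not the actual opentelemetry-instrumentation-openai-v2 classes this PR adds.

```python
# Hypothetical sketch: an async wrapper that mirrors the sync wrapper's
# guarantees -- record failures, finalize the span, and close the
# wrapped stream. Not the actual instrumentation implementation.
class AsyncStreamWrapper:
    def __init__(self, stream, span):
        self._stream = stream  # the wrapped OpenAI async stream
        self._span = span      # telemetry span to finalize

    async def __aenter__(self):
        await self._stream.__aenter__()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        try:
            if exc is not None:
                self._span.record_exception(exc)  # failure handling
        finally:
            self._span.end()  # telemetry finalization
        return await self._stream.__aexit__(exc_type, exc, tb)

    async def close(self):
        # Explicit close also cleans up the wrapped stream and the span.
        await self._stream.close()
        self._span.end()
```

The key property being tested in the PR's new unit coverage is that every exit path (`__aexit__`, `close()`, and the final-response helpers) leaves the span ended and the wrapped stream closed.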

Partly fixes #3436.

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

  • Unit Tests

Does This PR Require a Core Repo Change?

  • Yes. - Link to PR:
  • No.

Checklist:

See contributing.md for styleguide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added
  • Documentation has been updated

@linux-foundation-easycla

linux-foundation-easycla bot commented Mar 11, 2026

CLA Signed

The committers listed above are authorized under a signed CLA.

@property
def response(self):
-    response = self.stream.response
+    response = self.stream._response

is this reliable given it's internal? Can we check that the property exists and gracefully skip instrumentation for streams otherwise?


Yes, this is reliable, but I added a defensive guard that falls back to response if _response is unavailable.
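A defensive guard along these lines could use getattr with a fallback to the public attribute. This is a sketch of the pattern discussed above, not the exact code in the PR; the surrounding class is a hypothetical stand-in.

```python
# Sketch of the defensive guard: prefer the SDK-internal `_response`
# attribute, but fall back to the public `response` attribute when the
# internal one is missing (e.g. after an SDK refactor).
class ResponseStreamWrapper:
    def __init__(self, stream):
        self.stream = stream

    @property
    def response(self):
        stream_response = getattr(self.stream, "_response", None)
        if stream_response is None:
            stream_response = self.stream.response
        return stream_response
```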

return suppressed

def parse(self) -> "AsyncResponseStreamManagerWrapper[TextFormatT]":
    raise NotImplementedError(
OTel instrumentations should not throw - we can't break user apps, we can probably just ignore this call?


This was temporarily added based on previous PR feedback; I am going to remove it once I wire in the instrumentation.
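One non-throwing alternative, in line with the reviewer's point that OTel instrumentations must never break user applications, is to log and delegate the call to the wrapped manager instead of raising. This is a sketch under that assumption; the class and attribute names are illustrative.

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical non-throwing variant: instead of raising
# NotImplementedError, log once at debug level and delegate parse()
# to the wrapped stream manager so user code keeps working even when
# the call is not (yet) instrumented.
class AsyncResponseStreamManagerWrapper:
    def __init__(self, wrapped):
        self._wrapped = wrapped

    def parse(self):
        logger.debug(
            "parse() is not instrumented; delegating to the wrapped manager"
        )
        return self._wrapped.parse()
```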



Development

Successfully merging this pull request may close these issues.

Instrument OpenAI Responses API
