
Conversation


@adiberk adiberk commented Dec 18, 2025

Note

Manually manage the span in arun_stream to ensure proper context handling, error recording, and guaranteed span closure for async streaming.

  • Instrumentation (Agno):
    • Async streaming (_model_wrapper.arun_stream):
      • Replace start_as_current_span with a manual start_span and explicit context attach/detach (sketched after this list).
      • Add try/except/finally to record errors, set status, and always end() the span (handles cancellation/mismatched context).
      • Preserve the setting of output attributes and the extraction of token usage metrics from the final streamed response.
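
A minimal sketch of the pattern those bullets describe, assuming a tracer and a wrapped async generator; the wrapper and span names are illustrative, not the actual Agno instrumentation code:

from opentelemetry import context as context_api
from opentelemetry import trace as trace_api

async def arun_stream(tracer, wrapped, *args, **kwargs):
    span = tracer.start_span("Agent.arun_stream")
    # Make the span current manually instead of relying on use_span.
    token = context_api.attach(trace_api.set_span_in_context(span))
    try:
        async for chunk in wrapped(*args, **kwargs):
            yield chunk
        span.set_status(trace_api.StatusCode.OK)
    except Exception as e:
        span.set_status(trace_api.StatusCode.ERROR, str(e))
        span.record_exception(e)
        raise
    finally:
        context_api.detach(token)
        span.end()  # the span is always closed, even if the consumer stops iterating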

Written by Cursor Bugbot for commit 86b0f30.

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Dec 18, 2025
@RogerHYang (Contributor)

Hi! Thanks for the PR. I'm trying to understand the specific issue this is meant to address. Could you please share:

  1. What specific bug or behavior you observed?
  2. A minimal reproduction case if possible?

This would help us evaluate whether the change is necessary and correct. Thanks!

Looking at the OpenTelemetry Python source, start_as_current_span is essentially just start_span() + use_span() combined:

@_agnosticcontextmanager
def start_as_current_span(self, ..., end_on_exit=True):
    span = self.start_span(...)
    with trace_api.use_span(span, end_on_exit=end_on_exit, ...):
        yield span

Comparing the two approaches, they appear to be functionally equivalent:

Aspect             | Original (start_as_current_span) | PR (start_span + use_span)
-------------------|----------------------------------|-------------------------------------
Span creation      | start_span() internally          | start_span() explicitly
Context management | use_span(end_on_exit=True)       | use_span(end_on_exit=False)
Span ending        | Via use_span's finally block     | Via outer finally: span.end()
Exception handling | Via use_span's except block      | Via explicit except Exception block

While the structure differs, the end result is the same—both attach/detach context, end the span, and handle exceptions.
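
As a usage-level illustration of that equivalence (the span name and attribute are made up; this is a sketch, not code from the PR):

from opentelemetry import trace as trace_api

tracer = trace_api.get_tracer(__name__)

# Combined helper: creation, context activation, exception recording, and
# span.end() are all handled inside the context manager.
with tracer.start_as_current_span("stream-call") as span:
    span.set_attribute("example.key", "value")

# Split form: the same steps spelled out with start_span() + use_span().
span = tracer.start_span("stream-call")
with trace_api.use_span(span, end_on_exit=True, record_exception=True):
    span.set_attribute("example.key", "value")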

@dosubot dosubot bot added size:S This PR changes 10-29 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Dec 18, 2025
# Manually attach context (instead of using use_span context manager)
token = context_api.attach(trace_api.set_span_in_context(span))

try:

Bug: Span never ended if context attachment fails

The span is created on line 495, but the try block doesn't start until line 509, and the context_api.attach() call on line 507 sits between span creation and the try block. If context_api.attach() or trace_api.set_span_in_context() raises an exception, the span will never be ended, because the finally block containing span.end() won't execute. The try block should start immediately after span creation so that it also covers the context attachment and span.end() is always called.
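
A hypothetical restructuring along those lines (the wrapper shape and names are placeholders, not a patch from the PR):

from opentelemetry import context as context_api
from opentelemetry import trace as trace_api

async def arun_stream(tracer, wrapped, *args, **kwargs):
    span = tracer.start_span("Agent.arun_stream")
    try:
        # Attaching inside the try means a failure in set_span_in_context()
        # or attach() still reaches the outer finally, which ends the span.
        token = context_api.attach(trace_api.set_span_in_context(span))
        try:
            async for chunk in wrapped(*args, **kwargs):
                yield chunk
            span.set_status(trace_api.StatusCode.OK)
        finally:
            context_api.detach(token)
    except Exception as e:
        span.set_status(trace_api.StatusCode.ERROR, str(e))
        span.record_exception(e)
        raise
    finally:
        span.end()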


except Exception as e:
    span.set_status(trace_api.StatusCode.ERROR, str(e))
    span.record_exception(e)
    raise

Bug: CancelledError not caught, span status remains OK

The exception handler catches only Exception, but asyncio.CancelledError inherits from BaseException in Python 3.8+. When an async streaming operation is cancelled, CancelledError won't be caught, the span status remains OK (set on line 510), and no exception is recorded. Other similar instrumentations in this repository (like autogen-agentchat) explicitly catch BaseException and handle GeneratorExit separately to properly record cancellation errors while allowing graceful generator cleanup.
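
A hedged sketch of that handler shape, following the pattern the comment attributes to autogen-agentchat (the surrounding wrapper is illustrative):

from opentelemetry import context as context_api
from opentelemetry import trace as trace_api

async def arun_stream(tracer, wrapped, *args, **kwargs):
    span = tracer.start_span("Agent.arun_stream")
    token = context_api.attach(trace_api.set_span_in_context(span))
    try:
        async for chunk in wrapped(*args, **kwargs):
            yield chunk
        span.set_status(trace_api.StatusCode.OK)
    except GeneratorExit:
        raise  # generator closed by its consumer; not an error
    except BaseException as e:  # also covers asyncio.CancelledError on Python 3.8+
        span.set_status(trace_api.StatusCode.ERROR, str(e))
        span.record_exception(e)
        raise
    finally:
        context_api.detach(token)
        span.end()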

