fix(streaming): handle Stream objects in reask handlers #1992

Merged
jxnl merged 2 commits into 567-labs:main from thomasnormal:fix/streaming-reask-bug
Jan 13, 2026

Conversation

@thomasnormal (Contributor) commented Jan 12, 2026

Summary

  • Fix crash when using streaming mode with max_retries > 1 and validation fails
  • Add checks in reask handlers to detect Stream objects and use simplified error messages
  • Add unit tests to prevent regression

Problem

When using streaming mode (create_partial or Partial[Model] with stream=True) combined with max_retries > 1, if validation fails during streaming, instructor crashes with:

'Stream' object has no attribute 'choices'

This happens because the reask handlers expect a ChatCompletion object but receive a Stream object when streaming is enabled.
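The failure mode can be reproduced in miniature; the class and handler names below are illustrative stand-ins, not instructor's actual internals:

```python
# Minimal illustration of the crash (hypothetical names, not instructor's code).

class FakeStream:
    """Stands in for an OpenAI Stream: iterable of chunks, no .choices attribute."""
    def __init__(self, chunks):
        self._chunks = chunks

    def __iter__(self):
        return iter(self._chunks)

def naive_reask(response):
    # Pre-fix behavior: assumes a ChatCompletion and reads .choices directly.
    return response.choices[0].message.content

try:
    naive_reask(FakeStream(chunks=[]))
except AttributeError as exc:
    print(exc)  # 'FakeStream' object has no attribute 'choices'
```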

Solution

Modified the reask handlers in:

  • instructor/providers/openai/utils.py
  • instructor/providers/anthropic/utils.py

Added checks to detect Stream/non-standard response objects and fall back to a simplified error message that doesn't try to access response content.
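In sketch form, the guard works like this. This is a simplified rendition under assumed names and message text; the real helper (_is_stream_response, per the review comments below) lives in the provider utils and the exact wording differs:

```python
# Simplified sketch of the guard added to the reask handlers.
# Function names and message text are approximations, not the real code.

def _is_stream_response(response) -> bool:
    """A response without a .choices attribute is a Stream (or otherwise non-standard)."""
    return not hasattr(response, "choices")

def reask_with_fallback(response, exception):
    if response is None or _is_stream_response(response):
        # Fallback path: never touch response content we cannot rely on.
        return {
            "role": "user",
            "content": f"Validation failed with error: {exception}. Please try again.",
        }
    # Normal path: quote the failed completion back to the model for correction.
    return {
        "role": "user",
        "content": f"Correct this output: {response.choices[0].message.content}\n"
                   f"Error: {exception}",
    }
```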

Test plan

  • Added unit tests in tests/test_streaming_reask_bug.py
  • Verified fix with real OpenAI API calls
  • Ran the linter (ruff check); all checks passed
  • Existing tests still pass
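The regression test follows this general shape. It is a self-contained sketch: the review comments mention a MockStream class in tests/test_streaming_reask_bug.py, but the handler and test bodies here are assumptions, with an inlined stand-in for the patched reask function:

```python
# Self-contained sketch mirroring the MockStream approach described in the
# review comments; build_reask_message is a stand-in, not the real handler.

class MockStream:
    """Simulates a streaming response: iterable, deliberately no 'choices'."""
    def __iter__(self):
        return iter([])

def build_reask_message(response, exception):
    # Stand-in for the patched handler: fall back when .choices is missing.
    if not hasattr(response, "choices"):
        return {"role": "user", "content": f"Validation failed: {exception}"}
    return {"role": "user", "content": response.choices[0].message.content}

def test_reask_handles_stream_object():
    # Before the fix this path raised AttributeError; now it must not.
    msg = build_reask_message(MockStream(), ValueError("field required"))
    assert "field required" in msg["content"]

test_reask_handles_stream_object()
```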

Fixes #1991

🤖 Generated with Claude Code

When using streaming mode with max_retries > 1, if validation fails,
the reask handlers would crash with "'Stream' object has no attribute
'choices'" because they expected a ChatCompletion but received a
Stream object.

This fix adds checks in the reask handlers for OpenAI and Anthropic
to detect Stream/non-standard response objects and fall back to a
simplified error message that doesn't try to access response content.

Fixes 567-labs#1991

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

@ellipsis-dev (bot) left a comment

Important

Looks good to me! 👍

Reviewed everything up to 2c87556 in 53 seconds.
  • Reviewed 329 lines of code in 3 files
  • Skipped 0 files when reviewing.
  • Skipped posting 7 draft comments. View those below.
1. instructor/providers/anthropic/utils.py:151
  • Draft comment:
    Good handling of non-Message (stream) responses. Consider extracting the repeated fallback error message string into a shared helper or constant to reduce duplication across the reask functions.
  • Reason this comment was not posted:
    Confidence changes required: 66% <= threshold 85% None
2. instructor/providers/anthropic/utils.py:212
  • Draft comment:
    Similarly, in reask_anthropic_json, the fallback error message is hard-coded. A helper to construct these error messages might improve consistency.
  • Reason this comment was not posted:
    Confidence changes required: 66% <= threshold 85% None
3. instructor/providers/openai/utils.py:22
  • Draft comment:
    The _is_stream_response helper has a clear docstring. Ensure that its logic (checking absence of 'choices') is in line with how all stream responses are expected to behave, particularly in functions that check for an 'output' attribute.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 85% None
4. instructor/providers/openai/utils.py:45
  • Draft comment:
    Consistent fallback logic for handling stream responses in reask functions; consider extracting the fallback error message text to a shared helper to follow DRY principles.
  • Reason this comment was not posted:
    Confidence changes required: 66% <= threshold 85% None
5. tests/test_streaming_reask_bug.py:18
  • Draft comment:
    The MockStream class effectively simulates a stream response without the 'choices' attribute. Tests are well structured to cover both stream and None responses.
  • Reason this comment was not posted:
    Confidence changes required: 0% <= threshold 85% None
6. tests/test_streaming_reask_bug.py:146
  • Draft comment:
    The integration test using create_partial is a good regression test. Ensure that it continues to align with downstream API changes.
  • Reason this comment was not posted:
    Confidence changes required: 0% <= threshold 85% None
7. instructor/providers/openai/utils.py:89
  • Draft comment:
    In reask_responses_tools, the check for the 'output' attribute differs from other _is_stream_response checks. Verify that stream responses indeed never have 'output' and that this condition fully covers streaming cases.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 85% None

Workflow ID: wflow_h4R0S3dAFZL7zGfH


@thomasnormal changed the title from "fix(streaming): handle Stream objects in reask handlers" to "fix(streaming): skip model validators during partial streaming" Jan 13, 2026
@thomasnormal force-pushed the fix/streaming-reask-bug branch from 53565e2 to 2c87556 on January 13, 2026 at 13:52
@thomasnormal changed the title from "fix(streaming): skip model validators during partial streaming" back to "fix(streaming): handle Stream objects in reask handlers" Jan 13, 2026
@jxnl merged commit bf9421b into 567-labs:main Jan 13, 2026
1 of 3 checks passed

Development

Successfully merging this pull request may close these issues.

Bug: Streaming mode with max_retries > 1 crashes on validation failure
