fix(streaming): handle Stream objects in reask handlers #1992
Merged
jxnl merged 2 commits into 567-labs:main on Jan 13, 2026
Conversation
When using streaming mode with max_retries > 1, if validation fails, the reask handlers would crash with "'Stream' object has no attribute 'choices'" because they expected a ChatCompletion but received a Stream object.

This fix adds checks in the reask handlers for OpenAI and Anthropic to detect Stream/non-standard response objects and fall back to a simplified error message that doesn't try to access response content.

Fixes 567-labs#1991

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
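A rough sketch of the pattern the fix describes, assuming a chat-completions reask handler of this shape (the function name, signature, and message wording are illustrative, not the exact code merged in this PR):

```python
# Illustrative sketch only: detect a Stream (which has no `.choices`) and fall
# back to a simplified reask message instead of echoing the assistant response.
from typing import Any


def reask_chat_tools(kwargs: dict[str, Any], response: Any, exception: Exception) -> dict[str, Any]:
    kwargs = kwargs.copy()

    if not hasattr(response, "choices"):
        # Streaming case: there is no ChatCompletion to quote, so only describe
        # the validation error and ask the model to try again.
        kwargs["messages"].append(
            {
                "role": "user",
                "content": (
                    f"Validation of the streamed response failed: {exception}. "
                    "Please try again and correct the errors."
                ),
            }
        )
        return kwargs

    # Non-streaming case: re-send the assistant message plus the validation
    # errors, as the handlers already did before this fix.
    kwargs["messages"].append(response.choices[0].message.model_dump())
    kwargs["messages"].append(
        {
            "role": "user",
            "content": f"Validation failed: {exception}. Please fix the errors and respond again.",
        }
    )
    return kwargs
```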
Contributor
Important
Looks good to me! 👍
Reviewed everything up to 2c87556 in 53 seconds.
- Reviewed 329 lines of code in 3 files
- Skipped 0 files when reviewing
- Skipped posting 7 draft comments. View those below.
- Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. instructor/providers/anthropic/utils.py:151
- Draft comment: Good handling of non-Message (stream) responses. Consider extracting the repeated fallback error message string into a shared helper or constant to reduce duplication across the reask functions (see the sketch after this list).
- Reason this comment was not posted: Confidence changes required: 66% <= threshold 85%
2. instructor/providers/anthropic/utils.py:212
- Draft comment: Similarly, in reask_anthropic_json, the fallback error message is hard-coded. A helper to construct these error messages might improve consistency.
- Reason this comment was not posted: Confidence changes required: 66% <= threshold 85%
3. instructor/providers/openai/utils.py:22
- Draft comment: The _is_stream_response helper has a clear docstring. Ensure that its logic (checking the absence of 'choices') is in line with how all stream responses are expected to behave, particularly in functions that check for an 'output' attribute.
- Reason this comment was not posted: Confidence changes required: 33% <= threshold 85%
4. instructor/providers/openai/utils.py:45
- Draft comment: Consistent fallback logic for handling stream responses in reask functions; consider extracting the fallback error message text to a shared helper to follow DRY principles.
- Reason this comment was not posted: Confidence changes required: 66% <= threshold 85%
5. tests/test_streaming_reask_bug.py:18
- Draft comment: The MockStream class effectively simulates a stream response without the 'choices' attribute. Tests are well structured to cover both stream and None responses (see the test sketch after this list).
- Reason this comment was not posted: Confidence changes required: 0% <= threshold 85%
6. tests/test_streaming_reask_bug.py:146
- Draft comment: The integration test using create_partial is a good regression test. Ensure that it continues to align with downstream API changes.
- Reason this comment was not posted: Confidence changes required: 0% <= threshold 85%
7. instructor/providers/openai/utils.py:89
- Draft comment: In reask_responses_tools, the check for the 'output' attribute differs from the other _is_stream_response checks. Verify that stream responses indeed never have 'output' and that this condition fully covers streaming cases.
- Reason this comment was not posted: Confidence changes required: 33% <= threshold 85%
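To make comments 1–4 and 7 concrete, here is one way the detection helpers and the suggested shared fallback message could look. Only the _is_stream_response name, its 'choices' check, and the 'output' check for reask_responses_tools come from the review above; the other names and the message wording are hypothetical:

```python
# Sketch of the stream-detection idea plus the DRY suggestion from the review.
# Names other than _is_stream_response are hypothetical.
STREAM_REASK_FALLBACK = (
    "The previous response was streamed, so its content is unavailable. "
    "Validation failed with: {error}. Please answer again and fix the errors."
)


def _is_stream_response(response: object) -> bool:
    """Return True for Stream/non-standard responses that lack `.choices`."""
    return not hasattr(response, "choices")


def _is_stream_like_responses_result(response: object) -> bool:
    """Responses API results expose `.output` instead of `.choices`; a Stream
    has neither, so this plays the same role for reask_responses_tools."""
    return not hasattr(response, "output")


def _stream_fallback_message(exception: Exception) -> str:
    """Build the simplified reask prompt used when the response is a stream."""
    return STREAM_REASK_FALLBACK.format(error=exception)
```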
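And a minimal sketch of the kind of test described in comments 5 and 6 (the MockStream name comes from the review; the import path comes from the changed files, and the asserted behaviour is an assumption):

```python
# Regression-test sketch: a stand-in for openai.Stream that, like the real
# object, has no `.choices` attribute.
class MockStream:
    """Simulates a streaming response object without `choices`."""

    def __iter__(self):
        return iter(())


def test_is_stream_response_detects_mock_stream():
    # Import path taken from the files changed in this PR; the exact helper
    # behaviour (True when `.choices` is missing) is assumed from the review.
    from instructor.providers.openai.utils import _is_stream_response

    class FakeCompletion:
        choices = []  # looks enough like a ChatCompletion for this check

    assert _is_stream_response(MockStream())
    assert not _is_stream_response(FakeCompletion())
```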
Workflow ID: wflow_h4R0S3dAFZL7zGfH
You can customize by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.
53565e2 to 2c87556
Summary
Fix a crash in the streaming reask handlers when max_retries > 1 and validation fails.
Problem
When using streaming mode (create_partial or Partial[Model] with stream=True) combined with max_retries > 1, if validation fails during streaming, instructor crashes with "'Stream' object has no attribute 'choices'". This happens because the reask handlers expect a ChatCompletion object but receive a Stream object when streaming is enabled.
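For context, the failing combination could be reproduced with something like the sketch below (model name, prompt, and validator are illustrative; the essential ingredients are partial streaming plus max_retries > 1):

```python
# Hypothetical reproduction: a failing validator during partial streaming
# forces a reask, which previously crashed on the Stream object.
import instructor
from openai import OpenAI
from pydantic import BaseModel, field_validator


class User(BaseModel):
    name: str
    age: int

    @field_validator("name")
    @classmethod
    def must_be_uppercase(cls, v: str) -> str:
        if not v.isupper():
            raise ValueError("name must be uppercase")
        return v


client = instructor.from_openai(OpenAI())

for partial in client.chat.completions.create_partial(
    model="gpt-4o-mini",
    response_model=User,
    max_retries=2,  # > 1, so a validation failure triggers the reask path
    messages=[{"role": "user", "content": "Extract: jason is 25 years old"}],
):
    print(partial)
```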
Solution
Modified the reask handlers in:
- instructor/providers/openai/utils.py
- instructor/providers/anthropic/utils.py
Added checks to detect Stream/non-standard response objects and fall back to a simplified error message that doesn't try to access response content.
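The Anthropic side follows the same idea; a sketch, assuming the guard is simply "not an anthropic.types.Message" as the review's "non-Message (stream) responses" wording suggests (the body is illustrative, not the merged implementation):

```python
# Illustrative Anthropic-side guard: reask_anthropic_json is named in the
# review above, but this body is an assumption, not the actual code.
from typing import Any

from anthropic.types import Message


def reask_anthropic_json(kwargs: dict[str, Any], response: Any, exception: Exception) -> dict[str, Any]:
    kwargs = kwargs.copy()

    if not isinstance(response, Message):
        # Stream / non-standard response: don't touch response.content, just
        # describe the validation failure in a plain user message.
        kwargs["messages"].append(
            {
                "role": "user",
                "content": (
                    f"Validation of the streamed response failed: {exception}. "
                    "Please respond again with corrected JSON."
                ),
            }
        )
        return kwargs

    # Normal path: echo the assistant text back together with the errors.
    assistant_text = "".join(
        block.text for block in response.content if getattr(block, "type", "") == "text"
    )
    kwargs["messages"].append({"role": "assistant", "content": assistant_text})
    kwargs["messages"].append(
        {
            "role": "user",
            "content": f"Validation failed: {exception}. Please fix the errors.",
        }
    )
    return kwargs
```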
Test plan
- New tests in tests/test_streaming_reask_bug.py covering Stream and None responses
- Lint (ruff check): all checks passed

Fixes #1991
🤖 Generated with Claude Code