Fn Runner Watermark issue #34484
base: master
Conversation
Checks are failing. Will not request review until checks are succeeding. If you'd like to override that behavior, comment with the appropriate command.

Assigning reviewers: R: @claudevdm for label python.

The PR bot will only process comments in the main thread (not review comments).
```diff
@@ -388,7 +388,8 @@ def create_stages(
         translations.lift_combiners,
         translations.expand_sdf,
         translations.expand_gbk,
         translations.sink_flattens,
+        translations.fix_flatten_coders,
```
Why was this added?
The fix_flatten_coders phase was added to fix the YAML unit tests. It also mimics the translations.standard_optimize_phases() list used by the Portable Runner.
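For reference, a quick way to see that phase list (my sketch, assuming translations.standard_optimize_phases() returns the ordered list of phase functions, as in fn_api_runner/translations.py):

```python
# Sketch: print the optimization phases the Portable Runner applies, in order.
from apache_beam.runners.portability.fn_api_runner import translations

for phase in translations.standard_optimize_phases():
    # Fall back to the repr in case a phase is not a plain function.
    print(getattr(phase, '__name__', phase))
```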
Without it, YamlMappingTest::test_basic yields:

```
apache_beam.testing.util.BeamAssertException: Failed assert:
[Row(label='11a', isogeny='a'), Row(label='37a', isogeny='a'), Row(label='389a', isogeny='a')]
== [BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='11a', isogeny='a'),
    BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='37a', isogeny='a'),
    BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='389a', isogeny='a')],
unexpected elements [BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='11a', isogeny='a'),
    BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='37a', isogeny='a'),
    BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='389a', isogeny='a')],
missing elements [Row(label='11a', isogeny='a'), Row(label='37a', isogeny='a'), Row(label='389a', isogeny='a')]
```
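For context, my illustration (not from the PR) of why this assert fails: it is a type mismatch rather than a value mismatch. A schema-derived NamedTuple, standing in for the generated BeamSchema_... class, never compares equal to a beam.Row even when every field matches:

```python
from typing import NamedTuple

import apache_beam as beam

class BeamSchemaStandIn(NamedTuple):  # hypothetical stand-in for BeamSchema_ccf257cb_...
    label: str
    isogeny: str

# Same field values, different types: this prints False.
print(BeamSchemaStandIn('11a', 'a') == beam.Row(label='11a', isogeny='a'))
```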
It looks like the YAML PreCommit is still failing, can you please take a look?
I think this may not work because fix_flatten_coders assumes that the flattens will eventually be dealt with by sink_flattens. That may be what is causing the failures.
```diff
@@ -388,7 +388,8 @@ def create_stages(
         translations.lift_combiners,
         translations.expand_sdf,
         translations.expand_gbk,
-        translations.sink_flattens,
+        translations.fix_flatten_coders,
+        # translations.sink_flattens,
```
To be clear, this PR isn't fixing the underlying issue; rather, it is disabling the optimization?
Can we add a comment explaining why this is disabled, with a reference to the bug?
Also, I'm not sure the tradeoff of disabling this optimization is worth fixing this specific edge case; maybe @damccorm has thoughts?
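For instance, the requested comment could look something like this (hypothetical wording, reusing the phase list from the diff above):

```python
phases = [
    translations.lift_combiners,
    translations.expand_sdf,
    translations.expand_gbk,
    translations.fix_flatten_coders,
    # sink_flattens is disabled because fused Flatten sinks did not set a
    # watermark, which stalled the steps that follow them. See #26190.
]
```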
I think it would be better to try to fix the underlying issue. This has the potential to introduce new encoding/decoding problems that it may be possible to avoid in general.
Proposing a solution to #26190.
It appears the Flatten was not setting a watermark, which caused the following steps not to execute.
The issue raised errors on Beam versions 2.39 onwards, and it potentially produced unstable results before 2.39.
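A minimal sketch of the shape that triggers the symptom, assuming the scenario described in #26190 (a multi-input Flatten run on the FnApiRunner; the import path is my assumption):

```python
import apache_beam as beam
from apache_beam.runners.portability.fn_api_runner import FnApiRunner

with beam.Pipeline(runner=FnApiRunner()) as p:
    a = p | 'CreateA' >> beam.Create(['a1', 'a2'])
    b = p | 'CreateB' >> beam.Create(['b1', 'b2'])
    merged = (a, b) | beam.Flatten()
    # Before the fix, steps downstream of the Flatten could fail to run
    # because the Flatten stage never advanced its watermark.
    merged | beam.Map(print)
```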
There are meaningful TODOs mentioned in the surrounding code:

```python
# TODO(robertwb): Possibly fuse multi-input flattens into one of the stages.
# TODO(pabloem): Consider case where there are various producers
```