Conversation
Checks are failing. Will not request review until checks are succeeding. If you'd like to override that behavior, comment.

Assigning reviewers. If you would like to opt out of this review, comment. R: @claudevdm for label python.
The PR bot will only process comments in the main thread (not review comments).
translations.expand_sdf,
translations.expand_gbk,
translations.sink_flattens,
translations.fix_flatten_coders,
The fix_flatten_coders phase was added to fix the YAML unit tests.
It also mimics the translations.standard_optimize_phases() used by the Portable Runner.
Without it, YamlMappingTest::test_basic yields
apache_beam.testing.util.BeamAssertException: Failed assert: [
Row(label='11a', isogeny='a'), Row(label='37a', isogeny='a'), Row(label='389a', isogeny='a')] == [BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='11a', isogeny='a'), BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='37a', isogeny='a'), BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='389a', isogeny='a')],
unexpected elements [BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='11a', isogeny='a'), BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='37a', isogeny='a'), BeamSchema_ccf257cb_1966_410e_8157_00cd826e7392(label='389a', isogeny='a')],
missing elements [Row(label='11a', isogeny='a'), Row(label='37a', isogeny='a'), Row(label='389a', isogeny='a')
]
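The mismatch above is between element types, not values: the expected elements are Row instances while the actual elements carry a generated BeamSchema_... class. As a stand-alone illustration (plain Python dataclasses standing in for the two types; this is not Beam code), two structurally identical but distinct classes compare unequal:

```python
from dataclasses import dataclass

@dataclass
class Row:
    label: str
    isogeny: str

# A structurally identical but distinct type, a hypothetical stand-in
# for the runner's generated BeamSchema_... class.
@dataclass
class BeamSchema:
    label: str
    isogeny: str

# dataclass-generated __eq__ requires both operands to be the same class,
# so identical field values still compare unequal across the two types.
print(Row("11a", "a") == BeamSchema("11a", "a"))  # False
print(Row("11a", "a") == Row("11a", "a"))         # True
```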
It looks like the PreCommit YAML is still failing, can you please take a look?
I think this may not work because fix_flatten_coders assumes that the flattens will eventually be dealt with by sink_flattens. Could that be what is causing the failures?
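A toy model of that ordering assumption (hypothetical helper names and a dict-based graph, not Beam's actual translations API): phases transform the pipeline graph in list order, so a phase like fix_flatten_coders only leaves a consistent graph if a later phase like sink_flattens actually removes the Flattens it rewrote coders around:

```python
# Toy sketch: each "phase" is a function graph -> graph, applied in order.
def apply_phases(graph, phases):
    for phase in phases:
        graph = phase(graph)
    return graph

def fix_flatten_coders(graph):
    # Toy model: mark the Flattens' coders as unified.
    return {**graph, "flatten_coders_fixed": True}

def sink_flattens(graph):
    # Toy model: sinking assumes the coders were already unified.
    assert graph.get("flatten_coders_fixed"), "fix_flatten_coders must run first"
    return {**graph, "flattens": 0}

graph = apply_phases({"flattens": 2}, [fix_flatten_coders, sink_flattens])
print(graph["flattens"])  # 0
```

Reordering the list (or dropping sink_flattens) violates the precondition, which is one way an otherwise-correct phase can surface failures.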
translations.expand_gbk,
translations.sink_flattens,
translations.fix_flatten_coders,
# translations.sink_flattens,
To be clear, this PR isn't fixing the underlying issue; rather, it is disabling the optimization?
Can we add a comment explaining why this is disabled, with a reference to the bug?
Also, I am not sure the tradeoff of disabling this optimization is worth fixing this specific edge case; maybe @damccorm has thoughts?
I think it would be better to try to fix the underlying issue. This change has the potential to introduce new encoding/decoding problems that it may be possible to avoid entirely.
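A plain-Python analogy for that concern (json and pickle standing in for two different coders; Beam's coders are not these libraries): if a producer stage encodes an element with one scheme and a consumer stage decodes assuming another, the element is lost:

```python
import json
import pickle

value = {"label": "11a", "isogeny": "a"}

# Producer stage encodes with one "coder"...
encoded = pickle.dumps(value)

# ...but the consumer stage decodes assuming a different one, and fails.
try:
    json.loads(encoded)
    ok = True
except ValueError:  # UnicodeDecodeError / JSONDecodeError are ValueErrors
    ok = False
print(ok)  # False
```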
Reminder, please take a look at this PR: @claudevdm @damccorm

waiting on author

This pull request has been marked as stale due to 60 days of inactivity. It will be closed in 1 week if no further activity occurs. If you think that's incorrect or this pull request requires a review, please simply write any comment. If closed, you can revive the PR at any time and @mention a reviewer or discuss it on the dev@beam.apache.org list. Thank you for your contributions.

This pull request has been closed due to lack of activity. If you think that is incorrect, or the pull request requires review, you can revive the PR at any time.
Proposing a solution to #26190.
It appears the Flatten was not setting a watermark, which caused the following steps not to execute.
The issue raised errors on Beam versions 2.39 onwards, and it potentially produced unstable results before 2.39.
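A minimal sketch of the expected behavior (a toy model, not Beam's actual watermark machinery): a Flatten-like merge step should advance its output watermark to the minimum of its input watermarks. If the step never sets a watermark, it effectively stays at the minimum timestamp, and downstream stages that wait for the watermark to advance never execute:

```python
# Toy model of watermark propagation through a multi-input merge step.
MIN_TIMESTAMP = float("-inf")  # stand-in for "watermark never advanced"

def flatten_output_watermark(input_watermarks):
    """Output watermark of a Flatten: the minimum of its inputs.

    With no inputs reported (the buggy case), the watermark is held at
    MIN_TIMESTAMP, so downstream steps never see progress and never fire.
    """
    if not input_watermarks:
        return MIN_TIMESTAMP
    return min(input_watermarks)

print(flatten_output_watermark([12.0, 7.5, 9.0]))  # 7.5
print(flatten_output_watermark([]))                # -inf
```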
There are meaningful TODOs mentioned:
# TODO(robertwb): Possibly fuse multi-input flattens into one of the stages.
# TODO(pabloem): Consider case where there are various producers