pytest-benchmark for all elements #987
Useful to attach more data at Python runtime to the `sim` object.
This set of tests is used for performance benchmarking of individual elements pushes (as micro-benchmarks). We use this file to rapidly evaluate performance changes when tuning beamline element performance on CPUs and GPUs.
```python
def test_ThinDipole(benchmark, sim):
    el = elements.ThinDipole(name="kick", theta=0.45, rc=1.0)
    benchmark.pedantic(el.push, setup=partial(pc_setup, sim), rounds=rounds)
```
We could add a post-benchmark validation that:
- particles were not marked as invalid (e.g., lost)
- no inf/NaN values are present in the pushed particle beam attributes
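Such a validation could look like the following minimal sketch. The helper name `validate_beam` and the dictionary-of-arrays input are hypothetical stand-ins for however the pushed beam attributes are exposed, not part of the ImpactX API:

```python
import numpy as np


def validate_beam(attrs):
    """Hypothetical post-benchmark check: every pushed particle
    attribute array must be free of inf/NaN values."""
    for name, arr in attrs.items():
        if not np.all(np.isfinite(arr)):
            raise ValueError(f"non-finite values found in attribute '{name}'")


# example with dummy data standing in for pushed beam attributes
validate_beam({"x": np.array([0.0, 1.0e-3]), "px": np.array([1.0e-6, -2.0e-6])})
```

A check for particles marked invalid would work the same way, asserting on whatever validity flag the particle container carries.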
cemitch99 left a comment:
I suggested a simplification of the beam energy/distribution inputs. I think it's ok to leave the element parameters unchanged (from the originating examples). Otherwise, it looks good to me.
Co-authored-by: Chad Mitchell <[email protected]>
Ouch: because we have not released ImpactX yet, this relies on post-25.06 changes (#993) in pyAMReX/WarpX.
This integrates seamlessly with our pytest tests, e.g., to run only the benchmark tests (after `pip_install`):

```shell
python -m pytest tests/python/test_benchmark_elements.py
```

There are additional command-line flags for saving outputs and comparing them in the pytest-benchmark documentation.
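For example, the saving and comparing workflow could look like this (a usage sketch, assuming `pytest-benchmark` is installed; the flags come from its CLI):

```shell
# run the micro-benchmarks and save the results under a name
python -m pytest tests/python/test_benchmark_elements.py --benchmark-save=baseline

# after a performance change, re-run and compare against the saved baseline
python -m pytest tests/python/test_benchmark_elements.py --benchmark-compare
```

`--benchmark-autosave` can be used instead of naming each run by hand.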
Depends on:
- `AddParticles(pc & other)` broken in ImpactX: AMReX-Codes/amrex#4498