fix: skip excluded backends before test backend setup #2656
Conversation
CI is currently failing during pytest session startup due to a DeprecationWarning. This appears unrelated to the backend fixture change in this PR.
We have other PRs waiting to go in (#2652).
Got it, thanks! I’ll wait for that PR to land.
Force-pushed from f09a12a to 2ad69fe
Codecov Report: ✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##             main    #2656   +/-  ##
=======================================
  Coverage   98.24%   98.24%
=======================================
  Files          65       65
  Lines        4220     4220
  Branches      464      464
=======================================
  Hits         4146     4146
  Misses         45       45
  Partials       29       29
```

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
Hi! All CI checks are passing. Thanks a lot for the initial review and approval, and for your time!
Hi @matthewfeickert 👋 Just a gentle follow-up on this PR. All CI checks are passing. Thanks a lot for your time and for maintaining the project!
Thanks for the PR @iamkrishpathak, but I don't think that this does what you expect it to do, and if it does, then I don't think that this is actually a helpful addition as it would skip most of the tests.
Example: let's start with a clean env

```console
$ rm -rf .venv
$ uv venv
$ . .venv/bin/activate
$ uv pip install -e ".[all]" --group dev
```
If we then run with `only_numpy` we get

```console
$ pytest -m only_numpy
================================================================================= test session starts ==================================================================================
platform linux -- Python 3.13.7, pytest-9.0.2, pluggy-1.6.0
Matplotlib: 3.10.8
Freetype: 2.6.1
benchmark: 5.2.3 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/feickert/Code/GitHub/scikit-hep/pyhf
configfile: pyproject.toml
testpaths: tests
plugins: requests-mock-1.12.1, mpl-0.18.0, console-scripts-1.4.1, anyio-4.12.0, socket-0.7.0, benchmark-5.2.3, mock-3.15.1
collected 1160 items / 1151 deselected / 9 selected

tests/test_pdf.py .ss.ss.ss [100%]

=============================================================================== short test summary info ================================================================================
SKIPPED [1] tests/test_pdf.py:169: skipping test_core_pdf_broadcasting[jax] as specified to only look at: numpy
SKIPPED [1] tests/test_pdf.py:169: skipping test_core_pdf_broadcasting[numpy_minuit] as specified to only look at: numpy
SKIPPED [1] tests/test_pdf.py:355: skipping test_pdf_integration_histosys[jax] as specified to only look at: numpy
SKIPPED [1] tests/test_pdf.py:355: skipping test_pdf_integration_histosys[numpy_minuit] as specified to only look at: numpy
SKIPPED [1] tests/test_pdf.py:501: skipping test_pdf_integration_shapesys[jax] as specified to only look at: numpy
SKIPPED [1] tests/test_pdf.py:501: skipping test_pdf_integration_shapesys[numpy_minuit] as specified to only look at: numpy
==================================================================== 3 passed, 6 skipped, 1151 deselected in 1.88s =====================================================================
```

This covers only 1 test file and skips the bulk of the tests. I understand that it is selecting only the tests that are explicitly labeled as numpy-only, but testing only a very tiny segment of the codebase isn't something we want to encourage.
Maybe there's a difference in behavior between the version of pytest that you used and the latest version? My guess is that yes, pytest probably broke along the way and we didn't notice it (sorry if so!) and that we actually need to fix the behavior of the only selectors.
I understand the desire to avoid having to install jax to be able to run the tests, though a method that excludes backends, rather than one that includes them, seems preferable.
If we're missing something here (very possible) please let us know.
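To illustrate the exclusion-based idea, here is a minimal sketch (not pyhf's actual conftest; the function name and the `--skip-backend` option are assumptions made up for this example) of skipping parametrized tests for an excluded backend while leaving the rest of the suite intact:

```python
# Hypothetical sketch: exclusion-based backend selection.
# Instead of running only explicitly marked tests, skip any test whose
# parametrization targets an excluded backend and run everything else.


def excluded_backend(test_id, excluded):
    """Return True if a parametrized test id targets an excluded backend.

    Parametrized ids look like 'test_pdf_integration_histosys[jax]'.
    A backend+optimizer param like 'numpy_minuit' is treated as its
    backend, 'numpy'. Unparametrized tests are never excluded.
    """
    if "[" not in test_id:
        return False
    param = test_id[test_id.index("[") + 1 : test_id.rindex("]")]
    backend = param.split("_")[0]
    return backend in excluded


# In a conftest.py this could back a collection hook, e.g.:
#
# def pytest_collection_modifyitems(config, items):
#     excluded = set(config.getoption("--skip-backend"))  # assumed option
#     skip = pytest.mark.skip(reason="backend excluded via --skip-backend")
#     for item in items:
#         if excluded_backend(item.nodeid, excluded):
#             item.add_marker(skip)

print(excluded_backend("test_core_pdf_broadcasting[jax]", {"jax"}))  # True
print(excluded_backend("test_core_pdf_broadcasting[numpy_minuit]", {"jax"}))  # False
```

With this shape, excluding `jax` skips only jax-parametrized tests rather than deselecting the roughly 1150 tests that the inclusion-based `only_numpy` run drops.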
Thanks a lot for the detailed explanation; this is very helpful. You’re absolutely right: running only 9 tests out of ~1160 is not something we should encourage, and I agree that the current behavior of the `only_*` selectors is misleading. My original intent was to lower the barrier for contributors who don’t have JAX installed, but I see now that including only explicitly-marked tests is the wrong model here. An exclusion-based approach (e.g. “run everything except backend X”) makes much more sense and better matches real test coverage expectations. Given your note about possible pytest/marker behavior drift, I suspect this might indeed be exposing a deeper issue with how the `only_*` selection works. Before making changes, I’d like to confirm the preferred direction.

Happy to rework or pivot the PR once I understand the intended direction. Thanks again for taking the time to explain this so clearly.
This PR updates the backend test fixture to respect `only_*` markers before creating backend objects. This avoids importing optional backends (e.g. JAX, PyTorch) when running marker-restricted test suites such as `pytest -m only_numpy`, improving test behavior in minimal environments.

Fixes #2654
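The check described above can be sketched as follows. This is a hedged illustration, not the fixture code from this PR; the helper name `backend_allowed` and its arguments are assumptions made up for the example:

```python
# Hypothetical sketch: decide whether to set up a backend for a test,
# based on the test's only_* markers, before instantiating the backend
# (and therefore before importing its optional dependency).


def backend_allowed(marker_names, backend):
    """Return True if `backend` should be set up for this test.

    marker_names: names of the markers on the test, e.g. {'only_numpy'}.
    A test with no only_* markers runs against every backend.
    """
    only = {m.removeprefix("only_") for m in marker_names if m.startswith("only_")}
    return not only or backend in only


print(backend_allowed({"only_numpy"}, "numpy"))  # True
print(backend_allowed({"only_numpy"}, "jax"))    # False
print(backend_allowed(set(), "jax"))             # True
```

Consulting the markers before constructing the backend object is what keeps `pytest -m only_numpy` from importing jax or torch in a minimal environment.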