
Review with suggested changes#12

Merged
gvarnavi merged 16 commits into dev from uellue-patch-1
Jan 23, 2026
Conversation

@uellue
Member

@uellue uellue commented Jan 21, 2026

Hi @gvarnavi, looking good! Just a few small things, most I could address myself in this PR as suggestions.

Closes #1

  • Would it be possible to make `../data/apoF_4mrad_1.5um-df_3A-step_30eA2_binary.npy` available for download somehow, use another publicly available file, or simulate data on the fly, e.g. with abTEM, so that the example in the README can easily be tried out? If I understood correctly, the method requires specific imaging conditions and matching parameters? We can also include such a file in https://libertem.github.io/LiberTEM/sample_datasets.html
  • It took me some time to figure out how to run the tests since I'm unfamiliar with using uv this way. I guess `uv run pytest tests/` is the way? I've added that to CONTRIBUTORS.md, please adjust if necessary!
  • Small changes to the README example
    • Use the default `Context()`. I guess there's no particular reason to use the inline job executor?
    • Use the "auto" dataset type, which is easier to adapt to other types
    • Consistent spaces
    • To be tested since I don't have the data!
  • Made a method protected that looked internal and at the same time inviting to users
  • Run tests both with and without Numba JIT to collect coverage
    • To be confirmed that coverage works as expected when run in CI! I confirmed that the environment variable is propagated to Python.
  • Use unequal shapes in test_utils_against_quantem.py to catch x/y mixups (hopefully)
    • A test fails now just barely above the error threshold. Is this numerical instability or a real issue? Are both legs run with float64 or does Torch default to float32? If confirmed numerical stability issue one can also just increase the threshold a bit, I guess?
  • test_suppress_nyquist_frequency() and test_kernel_weight_conservation() test with an (8, 8) shape. Is it possible to test with uneven shapes like in test_utils_against_quantem.py?
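The rationale behind the unequal-shape bullets above can be illustrated with a minimal, hypothetical check (plain NumPy, not the package's actual tests): with a square (8, 8) array, a transposed result keeps the same shape, so an x/y mix-up can slip through shape checks; a rectangular shape makes the same bug fail loudly.

```python
import numpy as np

# Square test shape: a transpose (x/y mix-up) leaves the shape unchanged,
# so shape-based checks pass silently.
square = np.arange(64).reshape(8, 8)
assert square.T.shape == square.shape

# Rectangular (unequal) test shape: the same bug now changes the shape,
# so the mix-up is caught immediately.
rect = np.arange(8 * 12).reshape(8, 12)
assert rect.T.shape == (12, 8)
assert rect.T.shape != rect.shape
```

The same reasoning suggests extending the (8, 8) tests mentioned above to a shape like (8, 12), so that axis-order bugs cannot hide behind symmetry.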

* Use default Context(). I guess there's no particular reason to use the inline job executor?
* Consistent spaces, flake8
@uellue uellue marked this pull request as draft January 21, 2026 12:04
@uellue uellue changed the title Some suggested tweaks in README Review with suggested changes Jan 21, 2026
From what I understood, users are supposed to call `from_parameters()` and not
use `preprocess_geometry()` directly, right? If not, what would be a use case
for calling it directly?

Since the signature is very similar to `from_parameters()`, IMO it's a good idea to mark this as internal to avoid confusion and keep the public API compact.

Since `from_parameters()` is the designated entry point and this is relevant for users, this should probably be documented here?
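The pattern under discussion can be sketched like this (class and method names are illustrative, not the package's real API): the public classmethod is the documented entry point and delegates to an underscore-prefixed helper, which Python convention treats as internal.

```python
# Hypothetical sketch of the "protect the helper" pattern; names are
# illustrative only.
class Reconstructor:
    def __init__(self, geometry):
        self.geometry = geometry

    @classmethod
    def from_parameters(cls, scan_shape, detector_shape):
        # Public, documented constructor: delegates to the internal helper.
        geometry = cls._preprocess_geometry(scan_shape, detector_shape)
        return cls(geometry)

    @staticmethod
    def _preprocess_geometry(scan_shape, detector_shape):
        # Leading underscore signals "internal": not part of the public API,
        # free to change without a deprecation cycle.
        return {"scan": tuple(scan_shape), "detector": tuple(detector_shape)}
```

Users then only ever see `from_parameters()`, and the helper's signature can evolve freely.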

FIXME at some point set up proper documentation, probably
Helps catch mix-ups between x and y

This introduces a test failure just barely above the threshold. Is this
numerical instability or a real difference?
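One way to reason about the threshold question: machine epsilon bounds the relative agreement two implementations can reach. PyTorch defaults to float32 unless configured otherwise, so if one leg of the comparison runs in float32 while the other runs in float64, relative differences around 1e-7 are expected rounding noise rather than a real discrepancy. A quick NumPy check of the two epsilons:

```python
import numpy as np

# Machine epsilon for the two dtypes in question. If one leg of the
# comparison is float32 (Torch's default dtype), thresholds much tighter
# than ~1e-7 relative cannot be expected to hold.
eps32 = float(np.finfo(np.float32).eps)  # about 1.2e-07
eps64 = float(np.finfo(np.float64).eps)  # about 2.2e-16
assert eps32 > 1e-7 > eps64
```

If both legs are confirmed to run in float64 and the failure persists, the difference is more likely a real issue than dtype noise.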
@uellue uellue marked this pull request as ready for review January 21, 2026 16:28
@uellue uellue requested a review from gvarnavi January 21, 2026 16:28
Now that it is described below, I guess no longer needed? Should also
be easy enough to find when looking into the source code.

This makes it easier to change the implementation without breaking references
in the documentation.
@gvarnavi
Collaborator

Thanks @uellue!

There was indeed a very subtle bug with flipped/transposed detector orientations and the implicit fftshift centering for even grids, which this expanded set of tests caught!

  • Would it be possible to make ../data/apoF_4mrad_1.5um-df_3A-step_30eA2_binary.npy available for download somehow, use another publicly available file, or simulate data on the fly e.g. with abTEM, so that the example in the README can easily be tried out? If I understood correctly, the method requires specific imaging conditions and matching parameters? We can also include such a file in https://libertem.github.io/LiberTEM/sample_datasets.html

Indeed, yes -- I can upload this somewhere more permanent like Zenodo tomorrow. Since it's very low dose with binary counts, it's actually quite small (~80 MB), so it could conceivably also live in the repository. I will update the README and add a tutorial once I've done that, and then we can merge this.

@gvarnavi
Collaborator

Tiny dataset uploaded to Zenodo here: https://doi.org/10.5281/zenodo.18346853

Once you've gotten a chance to test @uellue, we can merge this and deploy?

Gives a nicer user experience :-)
@uellue
Member Author

uellue commented Jan 23, 2026

There was indeed a very subtle bug with flipped/transposed detector orientations and the implicit fftshift centering for even grids, which this expanded set of tests caught!

Happy to hear that it helped to be a bit pedantic! Also, stuff like a 1/2 px offset from `int(float)` vs `int(round(float))` is super annoying to get right. The worst are pixel coordinate values that fall exactly on a pixel boundary, so that the result flip-flops between pixels depending on the mood of the day.

From my side this looks good!

Before release, maybe you can have a discussion with @sk1p about how to set up a release pipeline, not sure how far you are with this already? This is super super helpful with maintenance since releases can be minted with minimal effort.

@uellue
Member Author

uellue commented Jan 23, 2026

...a godsend if upstream changes and breaks the package. Point release with a quick fix or version pin is a matter of minutes with such a pipeline.

@gvarnavi
Collaborator

Before release, maybe you can have a discussion with @sk1p about how to set up a release pipeline, not sure how far you are with this already? This is super super helpful with maintenance since releases can be minted with minimal effort.

Sure, yeah -- happy to discuss.

I suspect the only required change would be to add a LT_PYPI_TOKEN secret (presumably this already exists), and make the following changes in the deploy-to-pypi workflow:

diff --git a/.github/workflows/deploy-to-pypi.yml b/.github/workflows/deploy-to-pypi.yml
index 8ab24da..c126980 100644
--- a/.github/workflows/deploy-to-pypi.yml
+++ b/.github/workflows/deploy-to-pypi.yml
@@ -10,7 +10,7 @@ jobs:
     name: deploy
     runs-on: ubuntu-latest
     env:
-      UV_PUBLISH_TOKEN: ${{ secrets.TEST_PYPI_TOKEN }}
+      UV_PUBLISH_TOKEN: ${{ secrets.LT_PYPI_TOKEN }}
 
     steps:
       - uses: actions/checkout@v6
@@ -27,4 +27,4 @@ jobs:
         run: uv build
 
       - name: Publish to TestPyPI
-        run: uv publish --publish-url https://test.pypi.org/legacy/
+        run: uv publish

@gvarnavi gvarnavi merged commit 2283601 into dev Jan 23, 2026
4 checks passed
@sk1p sk1p deleted the uellue-patch-1 branch January 23, 2026 18:06
@sk1p
Member

sk1p commented Jan 23, 2026

Before release, maybe you can have a discussion with @sk1p about how to set up a release pipeline, not sure how far you are with this already? This is super super helpful with maintenance since releases can be minted with minimal effort.

Sure, yeah -- happy to discuss.

I suspect the only required change would be to add a LT_PYPI_TOKEN secret (presumably this already exists), and make the following changes in the deploy-to-pypi workflow:

Looks reasonable. The token is currently per-repo, but I've added one under that name. I would also change the PyPI workflow to only run on tags; I'll link an example from another project.

These lines make sure the workflow also runs on pushes to tags. In the case of this repo, I think the tags key alone should be sufficient:

https://github.com/LiberTEM/LiberTEM-rs/blob/db8f4669a1f0861a268bcc5dadfa98dd9d53376d/.github/workflows/CI.yml#L9-L10

In the case of LiberTEM-rs, the "release" job is integrated into the main CI workflow, so it needs to be skipped on anything that is not a tag run. As the deploy workflow is separate here, I don't think we need that guard (linked just for completeness):

https://github.com/LiberTEM/LiberTEM-rs/blob/db8f4669a1f0861a268bcc5dadfa98dd9d53376d/.github/workflows/CI.yml#L225

Optionally, I would recommend uploading release assets to the GitHub release as well. I'm using softprops/action-gh-release for this, for example:

https://github.com/LiberTEM/LiberTEM-rs/blob/db8f4669a1f0861a268bcc5dadfa98dd9d53376d/.github/workflows/CI.yml#L240-L243

With this setup, the release procedure (which needs a short mention in the docs) becomes:

  1. Bump the version in the package metadata and merge that into the main branch.
  2. Create a tag for the resulting commit from step 1. Once the tag is created, the pipeline, including the release job, will run. The tag can be created together with a GitHub release.
  3. Everything else is automated: once the pipeline finishes, the resulting sdists and wheels will be available both on PyPI and on the GitHub release page. There's an automatic correspondence between git tags, GitHub release versions, and PyPI versions.
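Combined with the earlier diff, the tags-only trigger and the release-asset upload might look roughly like this in deploy-to-pypi.yml (a sketch only: the `v*` tag pattern and the `dist/*` path are assumptions to adapt for this repo):

```yaml
# Sketch of the suggested deploy-to-pypi.yml changes; tag pattern and
# artifact path are assumptions for this repo.
on:
  push:
    tags:
      - 'v*'            # run the deploy workflow only for version tags

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # ... checkout, uv build, uv publish as in the existing workflow ...
      - name: Upload release assets to the GitHub release
        uses: softprops/action-gh-release@v2
        with:
          files: dist/*
```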

I'll be out of office the next week, but do let me know if you need any support.

BTW: docs might also need a paragraph about the dev/main branches

@sk1p sk1p mentioned this pull request Jan 23, 2026