torch/autograd support for the Mie backend#480
Open
edudc wants to merge 8 commits into
Conversation
added 8 commits
May 12, 2026 14:33
… propagation use, and stratified coefficients to preserve torch tensors/autograd
…ld summation, propagation matrix autodiff, and Zernike coefficient gradients
…umination angle; this now uses the equivalent cos_theta branch
Summary
This makes torch `MieSphere` rendering differentiable through both `geometric` and `hybrid` modes, including propagation and optical field assembly. The Mie backend now uses DeepTrack's array API namespace where possible, keeping the existing SciPy-based NumPy path for special functions.
Prerequisites
This PR depends on two active prerequisite fixes from other PRs:

- optics.py: al/bf-mie-fix #478
- aberrations.py: fix z-coeffs skip when 0 for torch #479

Please merge them first.
Changes
- Updated `deeptrack.backend.mie` to support torch tensors and autograd.
- Rewrote `MieScatterer` field construction to use Python array API operations.
- Made `get_propagation_matrix` backend-aware so torch inputs remain differentiable.
- Converted `MieSphere` coefficients so gradients flow to radius and refractive index.
- Made `MieStratifiedSphere` tensor-safe where it shares the same conversion path.

Autodiff is exercised through `MieSphere.resolve()` in `geometric` and `hybrid` modes, and through `get_propagation_matrix`.
Notes
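A minimal sketch of what backend-aware dispatch can look like. The helper name `get_array_namespace` is illustrative, not DeepTrack's actual API; the point is that torch inputs must be routed to torch ops so they stay differentiable:

```python
import numpy as np

def get_array_namespace(x):
    """Return the array module (torch or numpy) matching the input,
    so downstream math keeps torch tensors on the autograd path."""
    try:
        import torch
        if isinstance(x, torch.Tensor):
            return torch
    except ImportError:
        pass  # torch not installed: everything uses NumPy
    return np

# A NumPy input stays on the NumPy path; a torch tensor would route to torch.
xp = get_array_namespace(np.array([1.0, 2.0]))
print(xp.__name__)  # numpy
```

With this pattern, field-assembly code can call `xp.exp`, `xp.sin`, etc. once and work unchanged for both backends.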
`polynomials.py` currently depends on SciPy's arbitrary-order cylindrical Bessel/Hankel functions. `torch.special` does provide Bessel functions, but it does not cover the arbitrary-order `jv`/`yv` or `h1vp`. PyTorch's implementation also does not work with autograd: in 2.11.0, calling one of these functions on a tensor with `requires_grad=True` fails because the output `y` has no `grad_fn` (PyTorch forum discussion about this: https://discuss.pytorch.org/t/a-problem-about-autograd-and-special-bessel-j0/192062).

For the integer-order Riccati-Bessel functions used by Mie, we can compute them from `sin`, `cos`, and the recurrence relations, which makes them autograd-friendly (and works on GPU).
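A sketch of that recurrence approach, written here in plain Python with `math` for clarity; replacing `math.sin`/`math.cos` with `torch.sin`/`torch.cos` and list arithmetic with tensor ops gives the differentiable version. Note that upward recurrence is numerically unstable for orders n much larger than x, which a real implementation must handle:

```python
import math

def riccati_psi(n_max, x):
    """Riccati-Bessel psi_n(x) = x * j_n(x) for n = 0..n_max,
    via the upward recurrence psi_{n+1} = (2n+1)/x * psi_n - psi_{n-1}."""
    psi = [math.sin(x)]  # psi_0(x) = sin(x)
    if n_max >= 1:
        psi.append(math.sin(x) / x - math.cos(x))  # psi_1(x)
    for n in range(1, n_max):
        psi.append((2 * n + 1) / x * psi[n] - psi[n - 1])
    return psi

print(riccati_psi(2, 1.0))  # [0.8414..., 0.3011..., 0.0620...]
```

Because every step is built from `sin`, `cos`, and elementary arithmetic, autograd can differentiate through the whole chain, with no special-function kernels required.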