Adding FNO and DeepONets to Neuromancer #265
base: master
Conversation
…, loss definitions etc
Squashed commit of the following:
commit f39161f | Author: [email protected] <[email protected]> | Date: Fri Nov 28 18:31:55 2025 -0500 | Moved DeepXDEDataWrapper to operator.spy, final DeepXDE notebook cleanup
commit 022c590 | Author: [email protected] <[email protected]> | Date: Fri Nov 28 18:11:34 2025 -0500 | Working version of anti-derivative with DeepXDE DeepONetCartesianProd. Create correct data pipeline
commit bc5b4c5 | Author: [email protected] <[email protected]> | Date: Fri Nov 28 11:32:11 2025 -0500 | Final notebook for antiderivative example
commit a83ff8e | Author: [email protected] <[email protected]> | Date: Fri Nov 28 11:14:37 2025 -0500 | use Einsum instead of matmul for more general dim_x cases
commit c7ae9fa | Author: [email protected] <[email protected]> | Date: Thu Nov 27 19:27:42 2025 -0500 | Antiderivative notebook working version. Minimal changes in operators.py
commit d31edba | Merge: 1f97d43 d93bf89 | Author: [email protected] <[email protected]> | Date: Thu Nov 27 11:35:20 2025 -0500 | Merge branch 'master' into feature/DeepONet
commit 1f97d43 | Author: Colby Ham <[email protected]> | Date: Wed Aug 28 16:46:56 2024 -0700 | remove data_dir and cut some old unused code
commit 4e01c8f | Author: Colby Ham <[email protected]> | Date: Fri Aug 2 17:01:56 2024 -0700 | Get model to work with generated data
commit 2f02dcf | Author: Colby Ham <[email protected]> | Date: Mon Jul 29 09:38:11 2024 -0700 | Add attempt at generating 1d data and antiderivative ground truth
commit b3206d4 | Author: Colby Ham <[email protected]> | Date: Mon Jul 29 09:37:47 2024 -0700 | add unfinished comments about shapes of DeepONet
commit 0de774f | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 15:16:52 2024 -0700 | Remove extra code for getting data generation working from operators.py
commit ed34ee2 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 15:03:50 2024 -0700 | Push changes from trainer
commit 032ae78 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 15:01:57 2024 -0700 | Test changes on problem to see if autoformat issue
commit 17f62bd | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 15:00:06 2024 -0700 | Attempt to add problem and trainer back in without format on save
commit 57beff6 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 14:56:22 2024 -0700 | Delete local version of my specific changes with too much formatting
commit 1c08c96 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 14:55:15 2024 -0700 | Move changes from problem back
commit e84e2b1 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 14:54:11 2024 -0700 | Move changes from trainer back
commit ded1a01 | Author: Colby Ham <[email protected]> | Date: Tue Jul 2 14:51:53 2024 -0700 | Add trainer and problem from develop, add separate copy i'm working on
commit ab1f644 | Author: Colby Ham <[email protected]> | Date: Sat Jun 29 12:00:50 2024 -0700 | Attempt to get data generation working
commit 8a3c469 | Author: Colby Ham <[email protected]> | Date: Fri Jun 28 17:12:23 2024 -0700 | Add latest progress working on data generation
commit 3f9fb6c | Author: Colby Ham <[email protected]> | Date: Fri Jun 28 16:01:21 2024 -0700 | Remove dataset download code from notebook
commit 3b66c3c | Author: Colby Ham <[email protected]> | Date: Fri Jun 28 15:58:54 2024 -0700 | Finish initial Class documentation for DeepONet
commit be5f2c0 | Author: Colby Ham <[email protected]> | Date: Fri Jun 28 13:38:21 2024 -0700 | Add current state of example from feature/deeponet_examples after messy rebase
…es/neural_operators
drgona left a comment
we can add the LossHistoryCallback class to callbacks.py
Since users will start from the first example, it would be good if the Part 1 example has extensive method markdown, similar to Part 3.
… we need our own implementation
| "pydot==1.4.2", | ||
| "pyts", | ||
| "torch", | ||
| "torch>=1.8", |
The torch>=1.8 requirement is for ifft support in PyTorch, which is needed for the FNOs.
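For context, the FNO spectral convolution relies on the torch.fft module that was introduced in PyTorch 1.8, hence the version bound. A minimal sketch of the pattern that requires it (illustrative only, not code from this PR):

```python
import torch

# FFT round trip used by a 1D spectral convolution; torch.fft.rfft/irfft are
# only available as a module from PyTorch 1.8 onward.
x = torch.randn(8, 1, 64)          # (batch, channels, grid)
x_hat = torch.fft.rfft(x, dim=-1)  # forward real FFT along the grid dimension
# ... multiply the retained Fourier modes by learned complex weights here ...
x_rec = torch.fft.irfft(x_hat, n=x.size(-1), dim=-1)  # inverse FFT back to the grid
```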
Should I add dependencies for neuralop and deepxde to the environment? Or would you like to keep it flexible?
It depends on how many and what dependencies there are. If it is a lot, we can add the necessary dependencies for neuralop and deepxde to the examples dependencies in our toml file.
https://github.com/pnnl/neuromancer/blob/master/pyproject.toml#L69
So, the deepxde library can be added like that, as we use it only in examples.
The neuraloperator library is being used here https://github.com/SOLARIS-JHU/neuromancer_NO/blob/feature/FNO/src/neuromancer/modules/operators.py#L6, so it will either need to be added as a main dependency, or we could do something like this in the pyproject.toml file, loading it only for operators (for example via a guarded import, sketched after the snippet below).
examples = [
    "deepxde",
    "imageio",
    "neuraloperator",
]

[project.optional-dependencies]
operators = [
    "neuraloperator",
]

Let me know what you prefer!
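If the operators extra is the route taken, a guarded import in operators.py could keep the core install light. A minimal sketch, where the error message and the extra name neuromancer[operators] are assumptions tied to the proposal above:

```python
# Hypothetical guarded import for src/neuromancer/modules/operators.py, assuming
# an optional "operators" extra as proposed above (message text is illustrative).
try:
    from neuralop.models import FNO
except ImportError as exc:  # neuraloperator is not installed
    raise ImportError(
        "The FNO wrapper requires the optional dependency 'neuraloperator'. "
        "Install it with: pip install 'neuromancer[operators]'"
    ) from exc
```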
Formatting changes only (linter)
Let's run all the unit tests on this PR to make sure we are not breaking the core.
Well done with the examples!
It is not apparent from the notebooks what the main difference is between:
Part_1_DeepONet_antiderivative_aligned.ipynb
Part_2_DeepONet_antiderivative_unaligned.ipynb
Similarly:
Part_5_FNO_DarcyFlow.ipynb
Part_3_DeepONet_DarcyFlow.ipynb
Part_3_DeepONetCartesianProd_DiffusionEquation.ipynb: this title does not match the content, which shows FNO. It looks like a deprecated file to be deleted.
Part_4_FNO_1DAllenCahn.ipynb
Let's make sure this dependency is included in our toml file:
from pathlib import Path
pathlib is a standard library module. No dependency entry is required.
Since you introduce changes to the core abstraction, we need to run all the tests to make sure we are not breaking the core.
Similarly here, let's run all the tests to make sure these changes to the core abstraction are backward compatible.
Similarly here, let's run all the tests to make sure these changes to the core abstraction are backward compatible.
This passed with a few warnings.
FNO examples: [image attached]
This PR updates #263.
It introduces DeepONets and Fourier Neural Operators (FNOs) into Neuromancer, with integrations for the external libraries neuralop and DeepXDE.
Integration is provided via lightweight wrappers that preserve Neuromancer’s abstractions while maintaining compatibility with upstream libraries.
Summary of Changes
Operator Integrations
FNO Wrapper
src/neuromancer/modules/operators.py: A thin wrapper enabling direct use of neuralop models inside Neuromancer. The wrapper attaches the required _metadata for robust checkpointing across PyTorch versions. The design generalizes to any operator provided by neuralop (a usage sketch with a neuralop FNO appears after the Implemented Notebooks list).

Native DeepONet (Cartesian Product)
src/neuromancer/modules/operators.py: Native implementation of DeepONet with Cartesian-product evaluation, consistent with standard operator-learning formulations (a sketch of the forward pass is shown after this list of integrations).
DeepXDE Wrapper
src/neuromancer/modules/operators.py: Wrapper for DeepONet-style models implemented in DeepXDE. Supports both dde.nn.DeepONetCartesianProd and dde.nn.DeepONet.

Custom Operator Losses
src/neuromancer/modules/operators.py: Implementation of LpLoss and H1Loss, suitable for grid-based operator learning.
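For readers skimming the PR, here is a minimal sketch of the Cartesian-product DeepONet forward pass and of a relative Lp loss for grid-based operator learning. It is illustrative only: the class names, attribute names, and the assumption that branch and trunk nets are plain MLPs are mine, not necessarily what operators.py implements; the einsum contraction mirrors the "use Einsum instead of matmul" commit above, and the loss is a simplified stand-in for the PR's LpLoss/H1Loss.

```python
import torch
import torch.nn as nn


class DeepONetCartesianProdSketch(nn.Module):
    """Illustrative DeepONet with Cartesian-product evaluation.

    branch_net: maps sensor values u(x_1..x_m), shape (batch, m), to (batch, p)
    trunk_net:  maps query coordinates y, shape (n_points, dim_y), to (n_points, p)
    output:     G(u)(y) with shape (batch, n_points)
    """

    def __init__(self, branch_net: nn.Module, trunk_net: nn.Module):
        super().__init__()
        self.branch_net = branch_net
        self.trunk_net = trunk_net
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        b = self.branch_net(u)   # (batch, p)
        t = self.trunk_net(y)    # (n_points, p)
        # Cartesian product: every input function is evaluated at every query
        # point by contracting the shared latent dimension p.
        return torch.einsum("bp,np->bn", b, t) + self.bias


class RelativeLpLossSketch(nn.Module):
    """Relative Lp error averaged over the batch (illustrative, not the PR's LpLoss)."""

    def __init__(self, p: float = 2.0, eps: float = 1e-8):
        super().__init__()
        self.p, self.eps = p, eps

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        dims = tuple(range(1, pred.ndim))  # all non-batch (grid/channel) dimensions
        diff = torch.linalg.vector_norm(pred - target, ord=self.p, dim=dims)
        ref = torch.linalg.vector_norm(target, ord=self.p, dim=dims)
        return torch.mean(diff / (ref + self.eps))
```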
Training and Validation Infrastructure
Changes in trainer.py, problem.py, and callbacks.py to support validation datasets during training (inspired by DeepONet Example #168). A new LossHistoryCallback is added to callbacks.py to track and plot training and validation loss per epoch (a sketch of the callback is shown after this item).
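Roughly, the callback records the losses reported at the end of each epoch and plots them afterwards. The sketch below is a loose illustration: the hook name end_epoch and the output keys train_loss/dev_loss are assumptions about the Trainer's callback interface, not a copy of the implementation in callbacks.py.

```python
import matplotlib.pyplot as plt


class LossHistoryCallbackSketch:
    """Loss-history callback sketch; not the class added in callbacks.py.

    Assumes the trainer calls end_epoch(trainer, output) once per epoch and that
    output is a dict containing per-epoch losses under the keys given below.
    """

    def __init__(self, train_key="train_loss", val_key="dev_loss"):
        self.train_key = train_key
        self.val_key = val_key
        self.train_history = []
        self.val_history = []

    def end_epoch(self, trainer, output):
        # Record losses if the trainer reports them for this epoch.
        if self.train_key in output:
            self.train_history.append(float(output[self.train_key]))
        if self.val_key in output:
            self.val_history.append(float(output[self.val_key]))

    def plot(self):
        # Plot training vs. validation loss per epoch.
        plt.plot(self.train_history, label="train")
        plt.plot(self.val_history, label="validation")
        plt.xlabel("epoch")
        plt.ylabel("loss")
        plt.legend()
        plt.show()
```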
Implemented Notebooks

DeepONet Examples
Antiderivative (Aligned GRF Dataset)
examples/neural_operators/Part_1_DeepONet_antiderivative_aligned.ipynb. Inspiration: DeepXDE aligned antiderivative demo and Lu et al. (2021).
Antiderivative (Unaligned GRF Dataset)
examples/neural_operators/Part_2_DeepONet_antiderivative_unaligned.ipynb. Inspiration: DeepXDE unaligned antiderivative demo.
1D Advection equation (Aligned initial conditions from GRF fields)
examples/neural_operators/Part_6_PIDeepONet_Advection_aligned.ipynb. Inspiration: DeepXDE aligned 1D Advection demo.
FNO Examples
1D Allen–Cahn Equation (from-scratch FNO)
examples/neural_operators/Part_4_FNO_1DAllenCahn.ipynb. Inspiration: ETH “Operator Learning – FNO” tutorial and the original FNO paper.
Darcy Flow (FNO, Neuromancer pipeline)
examples/neural_operators/Part_5_FNO_DarcyFlow.ipynb. Inspiration: neuralop Darcy Flow example and NVIDIA's FNO Darcy guide. (A usage sketch of the FNO wrapper in the Neuromancer pipeline is shown after this list.)

Diffusion Equation (FNO, Neuromancer pipeline)
examples/neural_operators/Part_6_FNO_DiffusionEquation.ipynb. Inspiration: examples/PDEs/Part_1_PINN_DiffusionEquation.ipynb.
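For orientation, a sketch of the pattern the FNO notebooks follow: wrap a neuralop FNO in a small nn.Module and expose it to Neuromancer as a Node. The wrapper class name and the data keys "a"/"u_hat" are placeholders rather than the names in operators.py; the FNO constructor arguments follow the public neuralop.models.FNO API, and Node is assumed to take (callable, input_keys, output_keys).

```python
import torch
import torch.nn as nn
from neuralop.models import FNO      # external dependency discussed in the toml thread
from neuromancer.system import Node  # Neuromancer's symbolic node wrapper


class FNOWrapperSketch(nn.Module):
    """Illustrative thin wrapper; names are placeholders, not the class in operators.py."""

    def __init__(self, **fno_kwargs):
        super().__init__()
        self.model = FNO(**fno_kwargs)

    def forward(self, a: torch.Tensor) -> torch.Tensor:
        # a: input coefficient field of shape (batch, channels, H, W)
        return self.model(a)


# Wrap the operator as a Node so it plugs into the usual Problem/Trainer pipeline.
fno = FNOWrapperSketch(n_modes=(16, 16), hidden_channels=32,
                       in_channels=1, out_channels=1)
surrogate = Node(fno, ["a"], ["u_hat"], name="fno_operator")
out = surrogate({"a": torch.randn(4, 1, 32, 32)})  # -> {"u_hat": (4, 1, 32, 32) tensor}
```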
Notebooks I could not implement (DeepONet)
The following notebooks are not included as they do not achieve accurate solutions using standard MLP DeepONets:
These have been moved to another SOLARIS branch (feature/DeepONet_notebooks_debug); we could revisit them later. Some papers solve Darcy Flow using neural operators and might be useful for debugging: the NVIDIA Darcy Flow example with DeepONet uses a convnet-based pix2pix model in the branch net, and another paper (https://www.mdpi.com/2227-9717/13/9/2754) solves Darcy Flow using various methods.
Remaining Work
neuralop SFNO SWE example: https://neuraloperator.github.io/dev/auto_examples/models/plot_SFNO_swe.html