Authors: Giacomo Baldan, Qiang Liu, Alberto Guardone, Nils Thuerey
Key contributions:
- Physics-Based Flow Matching (PBFM): Proposes a novel framework for integrating physical constraints into flow matching objectives, leveraging conflict-free gradient updates to minimize PDE and algebraic residuals simultaneously without manual weight tuning.
- Mitigation of Jensen’s Gap via Unrolling: Demonstrates that unrolled training trajectories effectively bridge the gap between training objectives and inference-time performance, yielding superior physical consistency without increasing the computational overhead of the final sampler.
- Analysis of Additional Gaussian Noise: Provides a theoretical and empirical analysis of the role of Gaussian noise in constrained flow matching, demonstrating how the choice of the noise floor $\sigma_{\min}$ affects the trade-off between distributional accuracy and the precision of physical constraints.
- Stochastic vs. Deterministic Sampling Analysis: Provides a formal analysis of the physics-vs-distribution trade-off, establishing the advantages of stochastic sampling and Gaussian noise injection for maintaining distributional fidelity under rigid physical priors.
- Seamless Integration: Offers a straightforward implementation strategy that can be integrated into existing flow matching pipelines, consistently improving both distributional accuracy and physical validity across multiple generative tasks.
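The conflict-free gradient updates mentioned above are provided by the `conflictfree` package (installed below). As a rough illustration of the underlying idea only, and not the exact ConFIG operator used in the paper, a PCGrad-style projection removes the mutually conflicting components of the two gradients before combining them:

```python
import torch

def conflict_free_update(g_flow, g_phys):
    """Illustrative conflict-free combination of two gradient vectors.

    If the flow-matching gradient and the physics-residual gradient
    conflict (negative inner product), project the conflicting component
    out of each before summing, so that the combined update does not
    increase either loss to first order. This is a PCGrad-style sketch,
    not the ConFIG operator PBFM actually uses.
    """
    dot = torch.dot(g_flow, g_phys)
    if dot < 0:
        # Drop each gradient's projection onto the other (conflicting) one.
        g_flow_proj = g_flow - dot / g_phys.norm() ** 2 * g_phys
        g_phys_proj = g_phys - dot / g_flow.norm() ** 2 * g_flow
        return g_flow_proj + g_phys_proj
    return g_flow + g_phys

# Two conflicting toy gradients: the combined update has a non-negative
# inner product with both, so neither objective is sacrificed.
u = conflict_free_update(torch.tensor([1.0, 0.0]), torch.tensor([-1.0, 1.0]))
```

Avoiding a weighted sum of losses is what removes the manual weight tuning between the flow-matching and residual objectives.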
Abstract: Physics-constrained generative modeling aims to produce high-dimensional samples that are both physically consistent and distributionally accurate, a task that remains challenging due to often conflicting optimization objectives. Recent advances in flow matching and diffusion models have enabled efficient generative modeling, but integrating physical constraints often degrades generative fidelity or requires costly inference-time corrections. Our work is the first to recognize the trade-off between distributional and physical accuracy. Building on the insight that these objectives are inherently conflicting, we introduce Physics-Based Flow Matching (PBFM), a method that enforces physical constraints at training time using conflict-free gradient updates and unrolling to mitigate Jensen's gap. Our approach avoids manual loss balancing and enables simultaneous optimization of generative and physical objectives. As a consequence, physics constraints do not impede inference performance. We benchmark our method across three representative PDE benchmarks. PBFM achieves a Pareto-optimal trade-off, competitive inference speed, and generalizes to a wide range of physics-constrained generative tasks, providing a practical tool for scientific machine learning.
Cite as:
@inproceedings{pbfm2026,
title={Physics vs Distributions: Pareto Optimal Flow Matching with Physics Constraints},
author={Giacomo Baldan and Qiang Liu and Alberto Guardone and Nils Thuerey},
booktitle={The Fourteenth International Conference on Learning Representations},
year={2026},
url={https://openreview.net/forum?id=tAf1KI3d4X}
}

Install the required Python packages using pip:
pip install torch h5py torchfsm conflictfree einops timm findiff rotary_embedding_torch
Training requires at least one GPU and uses PyTorch's Distributed Data Parallel (DDP). To train the model on a single GPU, run:
torchrun --nnodes=1 --nproc_per_node=1 train_ddp.py
Pretrained model checkpoints for each test case are available in the logs/PBFM folder. To generate samples using the pretrained PBFM model, run:
python sample.py --version PBFM
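The sampling script handles generation end-to-end. For orientation, the deterministic variant of flow-matching sampling amounts to integrating the learned velocity field from noise to data with a fixed-step ODE solver; the sketch below uses a toy velocity field and forward Euler steps as placeholders, not the repository's actual model interface:

```python
import torch

@torch.no_grad()
def euler_sample(velocity_model, shape, n_steps=50, device="cpu"):
    """Deterministic flow-matching sampling: integrate dx/dt = v(x, t)
    from t=0 (Gaussian noise) to t=1 (data) with forward Euler.

    `velocity_model(x, t)` stands in for the trained network; the
    repository's sample.py wraps its own model and sampler instead.
    """
    x = torch.randn(shape, device=device)  # initial Gaussian noise sample
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full((shape[0],), i * dt, device=device)
        x = x + dt * velocity_model(x, t)  # one Euler step along the flow
    return x

# Toy velocity field standing in for the trained network.
toy_velocity = lambda x, t: -x
samples = euler_sample(toy_velocity, shape=(4, 2), n_steps=100)
```

A stochastic sampler would additionally inject Gaussian noise at each step, which the paper argues helps preserve distributional fidelity under rigid physical constraints.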
The three benchmark cases are described in detail in the reference paper:
- Darcy flow
- Kolmogorov flow
- Dynamic stall
Kolmogorov flow and dynamic stall datasets are available from Hugging Face. For the Darcy flow dataset, see PIDM.
PBFM
├── darcy_flow
│ ├── train
│ │ ├── K_data.csv
│ │ └── p_data.csv
│ └── valid
│ ├── K_data.csv
│ └── p_data.csv
├── dynamic_stall
│ ├── dynamic_stall_test.h5
│ └── dynamic_stall_train.h5
└── kolmogorov_flow
├── kolmogorov_test.h5
└── kolmogorov_train.h5
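The HDF5 archives can be inspected with h5py before training. Since the key layout inside the files is not listed here, the sketch below writes and reads a dummy stand-in for kolmogorov_train.h5; the key name "fields" and the array shape are assumptions, so inspect `f.keys()` on the real file:

```python
import tempfile, os
import h5py
import numpy as np

# Stand-in for kolmogorov_flow/kolmogorov_train.h5; the dataset key
# "fields" and the (samples, channels, H, W) shape are placeholders.
path = os.path.join(tempfile.mkdtemp(), "dummy_kolmogorov.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("fields",
                     data=np.random.rand(8, 2, 64, 64).astype(np.float32))

with h5py.File(path, "r") as f:
    keys = list(f.keys())      # discover the dataset layout
    fields = f["fields"][:]    # load the array into memory as NumPy
```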
