
# Generative diffusion posterior sampling for informative likelihoods

This implementation is associated with the paper "Generative diffusion posterior sampling for informative likelihoods" (http://arxiv.org/abs/2506.01083). In the paper, we develop a new approach for conditional sampling of generative diffusion models using sequential Monte Carlo methods.
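To illustrate the general sequential Monte Carlo idea behind conditional sampling (this is a toy sketch, not the `gfk` implementation or its API; the Gaussian model, temperature schedule, and all variable names below are assumptions for illustration only):

```python
# Toy sketch: posterior sampling via tempered-likelihood SMC on a Gaussian
# model. NOT the paper's method -- a minimal, crude illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: prior x ~ N(0, 1); observation y = x + N(0, sigma_y^2).
sigma_y = 0.5
y = 1.0

# Anneal the likelihood over T steps so particles, initialised from the
# prior, are steered gradually toward the posterior.
T, n_particles = 50, 10_000
particles = rng.normal(size=n_particles)  # samples from the prior
log_w = np.zeros(n_particles)

for t in range(T):
    # Tempered likelihood increment: each step adds 1/T of the log-likelihood.
    log_w += -0.5 * (y - particles) ** 2 / sigma_y**2 / T
    # Normalise and resample when the effective sample size degenerates.
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    if 1.0 / np.sum(w**2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles, log_w = particles[idx], np.zeros(n_particles)
        # Small jitter keeps particle diversity (a crude stand-in for a
        # proper MCMC move; it adds a small amount of extra variance).
        particles += 0.05 * rng.normal(size=n_particles)

w = np.exp(log_w - log_w.max())
w /= w.sum()
post_mean = float(np.sum(w * particles))
# The exact Gaussian posterior mean here is y / (1 + sigma_y^2) = 0.8.
print(post_mean)
```

The weighted-particle estimate of the posterior mean should land close to the closed-form value 0.8 for this conjugate toy model.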

## Installation

Install the package via a standard procedure:

```bash
git clone git@github.com:zgbkdlm/gfk.git
cd gfk
pip install -e .
```

Depending on whether you run on CPU or GPU, you may need to uninstall `jax` and `jaxlib` and reinstall the variant matching your hardware (e.g., `pip install -U "jax[cuda12]"` for CUDA 12 GPUs).

## Reproduce experiments

To exactly reproduce the numbers and figures in the paper, first run the experiments:

```bash
cd experiments
bash runs_gms/bash_aux.sh --dx=256 --nparticles=16384
bash runs_gms/bash_aux_noiseless.sh --dx=256 --nparticles=16384
bash runs_gms/bash_mcgdiff.sh --dx=256 --nparticles=16384
bash runs_gms/bash_wu.sh --dx=256 --nparticles=16384
```

Then, run the scripts in `./summary` to produce the tables and figures. For example,

```bash
cd experiments
python ./summary/tabulate_gms.py
```

will produce the table.

## Citation

```bibtex
@article{Zhao2025b0smc,
    author = {Zhao, Zheng},
    title = {Generative diffusion posterior sampling for informative likelihoods},
    journal = {Communications in Information and Systems},
    note = {Special issue for celebrating Thomas Kailath's 90th birthday},
    year = {2025},
}
```

## Contact

Zheng Zhao, Linköping University, https://zz.zabemon.com.