paper/paper.bib: 3 additions & 13 deletions
@@ -59,7 +59,7 @@ @misc{Feng2023
 }
 
 @misc{Feng2024,
-      title={Variational Bayesian Imaging with an Efficient Surrogate Score-based Prior},
+      title={{V}ariational {B}ayesian {I}maging with an {E}fficient {S}urrogate {S}core-based {P}rior},
       author={Berthy T. Feng and Katherine L. Bouman},
       year={2024},
       eprint={2309.01949},
@@ -159,7 +159,7 @@ @misc{ambientdiffusion
 }
 
 @article{emulating,
-      title={<scp>CosmoPower</scp>: emulating cosmological power spectra for accelerated Bayesian inference from next-generation surveys},
+      title={CosmoPower: emulating cosmological power spectra for accelerated {B}ayesian inference from next-generation surveys},
       volume={511},
       ISSN={1365-2966},
       url={http://dx.doi.org/10.1093/mnras/stac064},
@@ -224,16 +224,6 @@ @misc{ldms
       url={https://arxiv.org/abs/2112.10752},
 }
 
-@misc{dits,
-      title={Scalable Diffusion Models with Transformers},
-      author={William Peebles and Saining Xie},
-      year={2023},
-      eprint={2212.09748},
-      archivePrefix={arXiv},
-      primaryClass={cs.CV},
-      url={https://arxiv.org/abs/2212.09748},
-}
-
 @misc{gans,
       title={Generative Adversarial Networks},
       author={Ian J. Goodfellow and Jean Pouget-Abadie and Mehdi Mirza and Bing Xu and David Warde-Farley and Sherjil Ozair and Aaron Courville and Yoshua Bengio},
paper/paper.md: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ aas-journal: Astrophysical Journal <- The name of the AAS journal.
 
 # Summary
 
-Diffusion models [@diffusion; @ddpm; @sde] have emerged as the dominant paradigm for generative modeling based on performance in a variety of tasks [@ldms; @dits]. The advantages of accurate density estimation and high-quality samples of normalising flows [@flows; @ffjord], VAEs [@vaes] and GANs [@gans] are subsumed into this method. Significant limitations exist on implicit and neural network based likelihood models with respect to modeling normalised probability distributions and sampling speed. Score-matching diffusion models are more efficient than previous generative model algorithms for these tasks. The diffusion process is agnostic to the data representation meaning different types of data such as audio, point-clouds, videos and images can be modelled.
+Diffusion models [@diffusion; @ddpm; @sde] have emerged as the dominant paradigm for generative modeling based on performance in a variety of tasks [@ldms; @dit]. The advantages of accurate density estimation and high-quality samples of normalising flows [@flows; @ffjord], VAEs [@vaes] and GANs [@gans] are subsumed into this method. Significant limitations exist on implicit and neural network based likelihood models with respect to modeling normalised probability distributions and sampling speed. Score-matching diffusion models are more efficient than previous generative model algorithms for these tasks. The diffusion process is agnostic to the data representation meaning different types of data such as audio, point-clouds, videos and images can be modelled.