|
18 | 18 | "- Utilities\n" |
19 | 19 | ] |
20 | 20 | }, |
| 21 | + { |
| 22 | + "cell_type": "code", |
| 23 | + "execution_count": null, |
| 24 | + "id": "cac88848", |
| 25 | + "metadata": {}, |
| 26 | + "outputs": [], |
| 27 | + "source": [ |
| 28 | + "# Example setup\n", |
| 29 | + "import torch\n", |
| 30 | + "\n", |
| 31 | + "from sbi.utils import BoxUniform\n", |
| 32 | + "\n", |
| 33 | + "# Define the prior\n", |
| 34 | + "num_dims = 2\n", |
| 35 | + "num_sims = 1000\n", |
| 36 | + "num_rounds = 2\n", |
| 37 | + "prior = BoxUniform(low=torch.zeros(num_dims), high=torch.ones(num_dims))\n", |
| 38 | + "simulator = lambda theta: theta + torch.randn_like(theta) * 0.1\n", |
| 39 | + "x_o = torch.tensor([0.5, 0.5])" |
| 40 | + ] |
| 41 | + }, |
21 | 42 | { |
22 | 43 | "cell_type": "markdown", |
23 | 44 | "id": "1e608393", |
24 | 45 | "metadata": {}, |
25 | 46 | "source": [ |
26 | | - "## Posterior estimation (NPE)\n" |
| 47 | + "## Posterior estimation (NPE)\n", |
| 48 | + "\n", |
| 49 | + "The core idea of Neural Posterior Estimation (NPE) is to train a conditional generative\n", |
| 50 | + "model that directly predicts the posterior given observations. This idea was originally\n", |
| 51 | + "developed by Papamakarios & Murray (NeurIPS 2016). The default implementation of NPE in\n", |
| 52 | + "the `sbi` package follows Greenberg, Nonnenmacher & Macke, who proposed NPE with\n", |
 | 53 | + "normalizing flows." |
27 | 54 | ] |
28 | 55 | }, |
29 | 56 | { |
30 | 57 | "cell_type": "markdown", |
31 | | - "id": "fe4bc6b4", |
| 58 | + "id": "8f1aa30f", |
32 | 59 | "metadata": {}, |
33 | 60 | "source": [ |
34 | | - "**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex)\n" |
 | 61 | + "As such, for the default implementation of `NPE` in the `sbi` package, we\n", |
 | 62 | + "recommend citing:\n", |
| 63 | + "\n", |
| 64 | + "**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf)\n", |
| 65 | + "\n", |
| 66 | + "**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)" |
35 | 67 | ] |
36 | 68 | }, |
37 | 69 | { |
38 | 70 | "cell_type": "code", |
39 | 71 | "execution_count": null, |
40 | | - "id": "aa3893ea", |
| 72 | + "id": "8e1c726d", |
41 | 73 | "metadata": {}, |
42 | 74 | "outputs": [], |
43 | 75 | "source": [ |
44 | | - "# Example setup\n", |
45 | | - "import torch\n", |
| 76 | + "from sbi.inference import NPE\n", |
46 | 77 | "\n", |
47 | | - "from sbi.utils import BoxUniform\n", |
| 78 | + "inference = NPE(prior)\n", |
| 79 | + "theta = prior.sample((num_sims,))\n", |
| 80 | + "x = simulator(theta)\n", |
| 81 | + "inference.append_simulations(theta, x).train()\n", |
| 82 | + "posterior = inference.build_posterior()\n", |
| 83 | + "samples = posterior.sample((1000,), x=x_o)" |
| 84 | + ] |
| 85 | + }, |
| 86 | + { |
| 87 | + "cell_type": "markdown", |
| 88 | + "id": "ba8bf467", |
| 89 | + "metadata": {}, |
| 90 | + "source": [ |
| 91 | + "Beyond this, the `sbi` package implements many modifications and extensions of these algorithms for Neural Posterior Estimation, which are outlined below." |
| 92 | + ] |
| 93 | + }, |
| 94 | + { |
| 95 | + "cell_type": "markdown", |
| 96 | + "id": "fe4bc6b4", |
| 97 | + "metadata": {}, |
| 98 | + "source": [ |
| 99 | + "**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf)\n", |
48 | 100 | "\n", |
49 | | - "# Define the prior\n", |
50 | | - "num_dims = 2\n", |
51 | | - "num_sims = 1000\n", |
52 | | - "num_rounds = 2\n", |
53 | | - "prior = BoxUniform(low=torch.zeros(num_dims), high=torch.ones(num_dims))\n", |
54 | | - "simulator = lambda theta: theta + torch.randn_like(theta) * 0.1\n", |
55 | | - "x_o = torch.tensor([0.5, 0.5])" |
 | 101 | + "Papamakarios & Murray (2016) were the first to use neural networks to directly\n", |
 | 102 | + "predict the posterior given observations. As their density estimator, they used\n", |
 | 103 | + "mixture density networks. In addition, they proposed a sequential algorithm for\n", |
 | 104 | + "neural posterior estimation. Their full algorithm can be implemented as follows:" |
56 | 105 | ] |
57 | 106 | }, |
58 | 107 | { |
|
77 | 126 | " proposal = posterior" |
78 | 127 | ] |
79 | 128 | }, |
| 129 | + { |
| 130 | + "cell_type": "markdown", |
| 131 | + "id": "70b4d312", |
| 132 | + "metadata": {}, |
| 133 | + "source": [ |
| 134 | + "**Flexible statistical inference for mechanistic models of neural dynamics**<br>by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher, Macke (NeurIPS 2017) <br>[[PDF]](https://proceedings.neurips.cc/paper/2017/hash/addfa9b7e234254d26e9c7f2af1005cb-Abstract.html)\n", |
| 135 | + "\n", |
 | 136 | + "Like Papamakarios & Murray (2016), Lueckmann et al. (2017) used mixture density\n", |
 | 137 | + "networks for NPE. In addition, they proposed embedding networks for time series and a\n", |
 | 138 | + "different loss function for the sequential version of the algorithm." |
| 139 | + ] |
| 140 | + }, |
| 141 | + { |
| 142 | + "cell_type": "code", |
| 143 | + "execution_count": null, |
| 144 | + "id": "98c3cd2e", |
| 145 | + "metadata": {}, |
| 146 | + "outputs": [], |
| 147 | + "source": [ |
| 148 | + "from sbi.inference import NPE_B\n", |
| 149 | + "from sbi.neural_nets import posterior_nn\n", |
| 150 | + "from sbi.neural_nets.embedding_nets import FCEmbedding\n", |
| 151 | + "\n", |
| 152 | + "embedding = FCEmbedding(num_dims)\n", |
| 153 | + "density_estimator = posterior_nn(\"mdn\", embedding_net=embedding)\n", |
| 154 | + "inference = NPE_B(prior, density_estimator=density_estimator)\n", |
| 155 | + "\n", |
| 156 | + "proposal = prior\n", |
| 157 | + "for _ in range(num_rounds):\n", |
| 158 | + " theta = proposal.sample((num_sims,))\n", |
| 159 | + " x = simulator(theta)\n", |
| 160 | + " _ = inference.append_simulations(theta, x, proposal=proposal).train()\n", |
| 161 | + " posterior = inference.build_posterior().set_default_x(x_o)\n", |
| 162 | + " proposal = posterior" |
| 163 | + ] |
| 164 | + }, |
80 | 165 | { |
81 | 166 | "cell_type": "markdown", |
82 | 167 | "id": "5ddd7f43", |
83 | 168 | "metadata": {}, |
84 | 169 | "source": [ |
85 | | - "**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)\n" |
| 170 | + "**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)\n", |
| 171 | + "\n", |
| 172 | + "Greenberg, Nonnenmacher & Macke were the first to use normalizing flows for neural\n", |
| 173 | + "posterior estimation (NPE), which is the default for the `NPE` class in the `sbi`\n", |
 | 174 | + "package (see above). In addition, they proposed embedding networks in combination\n", |
 | 175 | + "with normalizing flows and a modified loss function for the sequential version of\n", |
 | 176 | + "neural posterior estimation. These additional contributions\n", |
| 177 | + "can be implemented as follows:" |
86 | 178 | ] |
87 | 179 | }, |
88 | 180 | { |
89 | 181 | "cell_type": "code", |
90 | 182 | "execution_count": null, |
91 | | - "id": "b7d8514e", |
| 183 | + "id": "70cf7242", |
92 | 184 | "metadata": {}, |
93 | 185 | "outputs": [], |
94 | 186 | "source": [ |
95 | 187 | "from sbi.inference import NPE\n", |
| 188 | + "from sbi.neural_nets import posterior_nn\n", |
| 189 | + "from sbi.neural_nets.embedding_nets import FCEmbedding\n", |
| 190 | + "\n", |
| 191 | + "embedding = FCEmbedding(num_dims)\n", |
| 192 | + "density_estimator = posterior_nn(\"maf\", embedding_net=embedding)\n", |
| 193 | + "inference = NPE(prior, density_estimator=density_estimator)\n", |
96 | 194 | "\n", |
97 | | - "inference = NPE(prior)\n", |
98 | 195 | "proposal = prior\n", |
99 | 196 | "for _ in range(num_rounds):\n", |
100 | 197 | " theta = proposal.sample((num_sims,))\n", |
|
114 | 211 | "U. (2020) (IEEE transactions on neural networks and learning systems 2020)<br>\n", |
115 | 212 | "[Paper](https://ieeexplore.ieee.org/abstract/document/9298920)\n", |
116 | 213 | "\n", |
117 | | - "The density estimation part of BayesFlow is equivalent to single-round NPE. The\n", |
118 | | - "additional contribution of the paper are several embedding networks for high-dimensional\n", |
119 | | - "data including permutation invariant embeddings. Similar embeddings networks are\n", |
120 | | - "implemented in `sbi` as well, under `sbi.neural_nets.embedding_nets`." |
| 214 | + "The density estimation part of BayesFlow is equivalent to single-round NPE (Greenberg\n", |
 | 215 | + "et al., 2019). The additional contributions of BayesFlow are permutation-invariant\n", |
 | 216 | + "embedding networks for iid data, which can be implemented in `sbi` as follows:" |
121 | 217 | ] |
122 | 218 | }, |
123 | 219 | { |
|
129 | 225 | "source": [ |
130 | 226 | "# Posterior estimation with BayesFlow is equivalent to single-round NPE.\n", |
131 | 227 | "from sbi.inference import NPE\n", |
| 228 | + "from sbi.neural_nets import posterior_nn\n", |
| 229 | + "from sbi.neural_nets.embedding_nets import FCEmbedding, PermutationInvariantEmbedding\n", |
| 230 | + "\n", |
| 231 | + "num_iid = 3\n", |
| 232 | + "simulator_iid = lambda theta: theta + torch.randn((num_iid, *theta.shape)) * 0.1\n", |
| 233 | + "\n", |
| 234 | + "trial_net = FCEmbedding(num_dims, 20)\n", |
| 235 | + "embedding = PermutationInvariantEmbedding(trial_net, 20)\n", |
| 236 | + "density_estimator = posterior_nn(\"maf\", embedding_net=embedding)\n", |
| 237 | + "inference = NPE(prior, density_estimator=density_estimator)\n", |
132 | 238 | "\n", |
133 | | - "inference = NPE(prior)\n", |
134 | 239 | "theta = prior.sample((num_sims,))\n", |
135 | | - "x = simulator(theta)\n", |
| 240 | + "x = simulator_iid(theta).permute(1, 0, 2)\n", |
136 | 241 | "inference.append_simulations(theta, x).train()\n", |
137 | 242 | "posterior = inference.build_posterior()\n", |
138 | | - "samples = posterior.sample((1000,), x=x_o)" |
| 243 | + "samples = posterior.sample((1000,), x=torch.zeros((1, num_iid, num_dims)))" |
139 | 244 | ] |
140 | 245 | }, |
141 | 246 | { |
|
239 | 344 | "id": "90120ff8", |
240 | 345 | "metadata": {}, |
241 | 346 | "source": [ |
242 | | - "**Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib)\n" |
| 347 | + "**Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf)\n" |
243 | 348 | ] |
244 | 349 | }, |
245 | 350 | { |
|
590 | 695 | ], |
591 | 696 | "metadata": { |
592 | 697 | "kernelspec": { |
593 | | - "display_name": "Python 3 (ipykernel)", |
| 698 | + "display_name": "sbi", |
594 | 699 | "language": "python", |
595 | 700 | "name": "python3" |
596 | 701 | }, |
|
618 | 723 | "toc_position": {}, |
619 | 724 | "toc_section_display": true, |
620 | 725 | "toc_window_display": false |
621 | | - }, |
622 | | - "vscode": { |
623 | | - "interpreter": { |
624 | | - "hash": "c50aa3a452b5e33eec699c3d0adceaddf116b15627c63bb6b43782d4547b8f5a" |
625 | | - } |
626 | 726 | } |
627 | 727 | }, |
628 | 728 | "nbformat": 4, |
|