
Commit a62e117

Add SNPE-B to implemented methods
1 parent f2fd1ab commit a62e117

2 files changed: +132 additions, −32 deletions


README.md

Lines changed: 1 addition & 1 deletion
@@ -228,7 +228,7 @@ Conduct](CODE_OF_CONDUCT.md).
 `sbi` is the successor (using PyTorch) of the
 [`delfi`](https://github.com/mackelab/delfi) package. It started as a fork of Conor M.
 Durkan's `lfi`. `sbi` runs as a community project. See also
-[credits](https://github.com/sbi-dev/sbi/blob/master/docs/docs/credits.md).
+[credits](https://github.com/sbi-dev/sbi/blob/master/docs/credits.md).

 ## Support

docs/tutorials/16_implemented_methods.ipynb

Lines changed: 131 additions & 31 deletions
@@ -18,41 +18,90 @@
 "- Utilities\n"
 ]
 },
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "cac88848",
+"metadata": {},
+"outputs": [],
+"source": [
+"# Example setup\n",
+"import torch\n",
+"\n",
+"from sbi.utils import BoxUniform\n",
+"\n",
+"# Define the prior\n",
+"num_dims = 2\n",
+"num_sims = 1000\n",
+"num_rounds = 2\n",
+"prior = BoxUniform(low=torch.zeros(num_dims), high=torch.ones(num_dims))\n",
+"simulator = lambda theta: theta + torch.randn_like(theta) * 0.1\n",
+"x_o = torch.tensor([0.5, 0.5])"
+]
+},
 {
 "cell_type": "markdown",
 "id": "1e608393",
 "metadata": {},
 "source": [
-"## Posterior estimation (NPE)\n"
+"## Posterior estimation (NPE)\n",
+"\n",
+"The core idea of Neural Posterior Estimation (NPE) is to train a conditional generative\n",
+"model that directly predicts the posterior given observations. This idea was originally\n",
+"developed by Papamakarios & Murray (NeurIPS 2016). The default implementation of NPE in\n",
+"the `sbi` package follows Greenberg, Nonnenmacher & Macke, who proposed NPE with\n",
+"normalizing flows:"
 ]
 },
 {
 "cell_type": "markdown",
-"id": "fe4bc6b4",
+"id": "8f1aa30f",
 "metadata": {},
 "source": [
-"**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex)\n"
+"As such, for the default implementation of `NPE` in the `sbi` package, we recommend\n",
+"citing:\n",
+"\n",
+"**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf)\n",
+"\n",
+"**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
-"id": "aa3893ea",
+"id": "8e1c726d",
 "metadata": {},
 "outputs": [],
 "source": [
-"# Example setup\n",
-"import torch\n",
+"from sbi.inference import NPE\n",
 "\n",
-"from sbi.utils import BoxUniform\n",
+"inference = NPE(prior)\n",
+"theta = prior.sample((num_sims,))\n",
+"x = simulator(theta)\n",
+"inference.append_simulations(theta, x).train()\n",
+"posterior = inference.build_posterior()\n",
+"samples = posterior.sample((1000,), x=x_o)"
+]
+},
+{
+"cell_type": "markdown",
+"id": "ba8bf467",
+"metadata": {},
+"source": [
+"Beyond this, the `sbi` package implements many modifications and extensions of these algorithms for Neural Posterior Estimation, which are outlined below."
+]
+},
+{
+"cell_type": "markdown",
+"id": "fe4bc6b4",
+"metadata": {},
+"source": [
+"**Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf)\n",
 "\n",
-"# Define the prior\n",
-"num_dims = 2\n",
-"num_sims = 1000\n",
-"num_rounds = 2\n",
-"prior = BoxUniform(low=torch.zeros(num_dims), high=torch.ones(num_dims))\n",
-"simulator = lambda theta: theta + torch.randn_like(theta) * 0.1\n",
-"x_o = torch.tensor([0.5, 0.5])"
+"Papamakarios et al. (2016) were the first to use neural networks to directly predict\n",
+"the posterior given observations. As the density estimator, they used mixture density\n",
+"networks. In addition, they proposed a sequential algorithm for neural posterior\n",
+"estimation. Their full algorithm can be implemented as follows:"
 ]
 },
 {
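Note for readers of this page: the code cell announced by "Their full algorithm can be implemented as follows:" falls between the two hunks, so only its last line (`proposal = posterior`) is visible as context below. A minimal sketch of such a multi-round loop, assuming `sbi`'s `NPE_A` class and the setup cell added above (the notebook's actual cell may differ, e.g. in how it handles NPE-A's final-round correction):

```python
# Hedged sketch of the elided sequential NPE-A loop (not the verbatim notebook cell).
from sbi.inference import NPE_A

inference = NPE_A(prior)

proposal = prior
for _ in range(num_rounds):
    theta = proposal.sample((num_sims,))
    x = simulator(theta)
    # Passing the proposal lets the method account for non-prior samples.
    _ = inference.append_simulations(theta, x, proposal=proposal).train()
    posterior = inference.build_posterior().set_default_x(x_o)
    proposal = posterior
```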
@@ -77,24 +126,72 @@
 "    proposal = posterior"
 ]
 },
+{
+"cell_type": "markdown",
+"id": "70b4d312",
+"metadata": {},
+"source": [
+"**Flexible statistical inference for mechanistic models of neural dynamics**<br>by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher, Macke (NeurIPS 2017) <br>[[PDF]](https://proceedings.neurips.cc/paper/2017/hash/addfa9b7e234254d26e9c7f2af1005cb-Abstract.html)\n",
+"\n",
+"Like Papamakarios et al. (2016), Lueckmann et al. (2017) used mixture density networks\n",
+"for NPE. In addition, they proposed embedding networks for time series and a\n",
+"different loss function for the sequential version of the algorithm."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"id": "98c3cd2e",
+"metadata": {},
+"outputs": [],
+"source": [
+"from sbi.inference import NPE_B\n",
+"from sbi.neural_nets import posterior_nn\n",
+"from sbi.neural_nets.embedding_nets import FCEmbedding\n",
+"\n",
+"embedding = FCEmbedding(num_dims)\n",
+"density_estimator = posterior_nn(\"mdn\", embedding_net=embedding)\n",
+"inference = NPE_B(prior, density_estimator=density_estimator)\n",
+"\n",
+"proposal = prior\n",
+"for _ in range(num_rounds):\n",
+"    theta = proposal.sample((num_sims,))\n",
+"    x = simulator(theta)\n",
+"    _ = inference.append_simulations(theta, x, proposal=proposal).train()\n",
+"    posterior = inference.build_posterior().set_default_x(x_o)\n",
+"    proposal = posterior"
+]
+},
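A design note on the NPE_B cell added above: unlike SNPE-A, which corrects for the non-prior proposal analytically after training, SNPE-B (Lueckmann et al., 2017) importance-weights each training sample by prior(θ)/proposal(θ) in the loss, which is why a plain `train()` call per round suffices here.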
 {
 "cell_type": "markdown",
 "id": "5ddd7f43",
 "metadata": {},
 "source": [
-"**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)\n"
+"**Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf)\n",
+"\n",
+"Greenberg, Nonnenmacher & Macke were the first to use normalizing flows for neural\n",
+"posterior estimation (NPE), which is the default for the `NPE` class in the `sbi`\n",
+"package (see above). They also proposed embedding networks in combination with\n",
+"normalizing flows, and a modified loss function for the sequential version of\n",
+"neural posterior estimation. These additional contributions can be implemented as\n",
+"follows:"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
-"id": "b7d8514e",
+"id": "70cf7242",
 "metadata": {},
 "outputs": [],
 "source": [
 "from sbi.inference import NPE\n",
+"from sbi.neural_nets import posterior_nn\n",
+"from sbi.neural_nets.embedding_nets import FCEmbedding\n",
+"\n",
+"embedding = FCEmbedding(num_dims)\n",
+"density_estimator = posterior_nn(\"maf\", embedding_net=embedding)\n",
+"inference = NPE(prior, density_estimator=density_estimator)\n",
 "\n",
-"inference = NPE(prior)\n",
 "proposal = prior\n",
 "for _ in range(num_rounds):\n",
 "    theta = proposal.sample((num_sims,))\n",
@@ -114,10 +211,9 @@
 "U. (2020) (IEEE transactions on neural networks and learning systems 2020)<br>\n",
 "[Paper](https://ieeexplore.ieee.org/abstract/document/9298920)\n",
 "\n",
-"The density estimation part of BayesFlow is equivalent to single-round NPE. The\n",
-"additional contribution of the paper are several embedding networks for high-dimensional\n",
-"data including permutation invariant embeddings. Similar embeddings networks are\n",
-"implemented in `sbi` as well, under `sbi.neural_nets.embedding_nets`."
+"The density estimation part of BayesFlow is equivalent to single-round NPE (Greenberg\n",
+"et al., 2019). The additional contributions of BayesFlow are permutation invariant\n",
+"embedding networks for iid data, which can be implemented in `sbi` as follows:"
 ]
 },
 {
@@ -129,13 +225,22 @@
 "source": [
 "# Posterior estimation with BayesFlow is equivalent to single-round NPE.\n",
 "from sbi.inference import NPE\n",
+"from sbi.neural_nets import posterior_nn\n",
+"from sbi.neural_nets.embedding_nets import FCEmbedding, PermutationInvariantEmbedding\n",
+"\n",
+"num_iid = 3\n",
+"simulator_iid = lambda theta: theta + torch.randn((num_iid, *theta.shape)) * 0.1\n",
+"\n",
+"trial_net = FCEmbedding(num_dims, 20)\n",
+"embedding = PermutationInvariantEmbedding(trial_net, 20)\n",
+"density_estimator = posterior_nn(\"maf\", embedding_net=embedding)\n",
+"inference = NPE(prior, density_estimator=density_estimator)\n",
 "\n",
-"inference = NPE(prior)\n",
 "theta = prior.sample((num_sims,))\n",
-"x = simulator(theta)\n",
+"x = simulator_iid(theta).permute(1, 0, 2)\n",
 "inference.append_simulations(theta, x).train()\n",
 "posterior = inference.build_posterior()\n",
-"samples = posterior.sample((1000,), x=x_o)"
+"samples = posterior.sample((1000,), x=torch.zeros((1, num_iid, num_dims)))"
 ]
 },
 {
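A note on the shapes in the cell above: `simulator_iid` returns a tensor of shape `(num_iid, num_sims, num_dims)`, so `.permute(1, 0, 2)` moves the batch dimension first, giving `x` the shape `(num_sims, num_iid, num_dims)` with the iid trials along dimension 1; consistently, the observation passed to `posterior.sample` has shape `(1, num_iid, num_dims)`.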
@@ -239,7 +344,7 @@
 "id": "90120ff8",
 "metadata": {},
 "source": [
-"**Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib)\n"
+"**Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf)\n"
 ]
 },
 {
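The hunk above only drops the BibTeX link from the SNL citation; the notebook's accompanying code cell is outside the diff. For context, a minimal (S)NLE sketch under the same setup, assuming `sbi`'s `NLE` class with MCMC-based posterior sampling:

```python
# Hedged sketch: neural likelihood estimation (NLE), then MCMC over the posterior.
from sbi.inference import NLE

inference = NLE(prior)
theta = prior.sample((num_sims,))
x = simulator(theta)
inference.append_simulations(theta, x).train()
# The learned likelihood is combined with the prior via MCMC at sampling time.
posterior = inference.build_posterior(mcmc_method="slice_np_vectorized")
samples = posterior.sample((1000,), x=x_o)
```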
@@ -590,7 +695,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Python 3 (ipykernel)",
+"display_name": "sbi",
 "language": "python",
 "name": "python3"
 },
@@ -618,11 +723,6 @@
 "toc_position": {},
 "toc_section_display": true,
 "toc_window_display": false
-},
-"vscode": {
-"interpreter": {
-"hash": "c50aa3a452b5e33eec699c3d0adceaddf116b15627c63bb6b43782d4547b8f5a"
-}
 }
 },
 "nbformat": 4,
