
Commit 70560a3

Fix broken links on website
1 parent e86ca97 commit 70560a3

17 files changed: +131 -230 lines changed

docs/index.rst

Lines changed: 3 additions & 7 deletions
@@ -52,13 +52,13 @@ To get started, install the `sbi` package with:
 
 python -m pip install sbi
 
-for more advanced install options, see our `Install Guide <https://sbi.readthedocs.io/en/latest/installation.html>`.
+for more advanced install options, see our `Install Guide <https://sbi.readthedocs.io/en/latest/installation.html>`_.
 
 Then, check out our material:
 
 - `Tutorials and Examples <https://sbi.readthedocs.io/en/latest/tutorials.html>`_
 
-- `Reference API <https://sbi.readthedocs.io/en/latest/reference.html>`_
+- `Reference API <https://sbi.readthedocs.io/en/latest/sbi.html>`_
 
 
 Motivation and approach
@@ -130,7 +130,7 @@ the inference on one particular observation to be more simulation-efficient
 (e.g., SNPE).
 
 Below, we list all implemented methods and their corresponding publications.
-For usage in ``sbi``, see the `Inference API reference <https://sbi.readthedocs.io/en/latest/reference.html>`_
+For usage in ``sbi``, see the `Inference API reference <https://sbi.readthedocs.io/en/latest/sbi.html>`_
 and the `tutorial on implemented methods <https://sbi.readthedocs.io/en/latest/tutorials/16_implemented_methods.html>`_.
 
 
@@ -140,17 +140,14 @@ Posterior estimation (``(S)NPE``)
 - **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**
 by Papamakarios & Murray (NeurIPS 2016)
 `PDF <https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf>`__
-`BibTeX <https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex>`__
 
 - **Flexible statistical inference for mechanistic models of neural dynamics**
 by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017)
 `PDF <https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf>`__
-`BibTeX <https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex>`__
 
 - **Automatic posterior transformation for likelihood-free inference**
 by Greenberg, Nonnenmacher & Macke (ICML 2019)
 `PDF <http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf>`__
-`BibTeX`__
 
 - **BayesFlow: Learning complex stochastic models with invertible neural networks**
 by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, U. (IEEE transactions on neural networks and learning systems 2020)
@@ -174,7 +171,6 @@ Likelihood-estimation (``(S)NLE``)
 - **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**
 by Papamakarios, Sterratt & Murray (AISTATS 2019)
 `PDF <http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf>`__
-`BibTeX <https://gpapamak.github.io/bibtex/snl.bib>`__
 
 - **Variational methods for simulation-based inference**
 by Glöckler, Deistler, Macke (ICLR 2022)

docs/tutorials/00_getting_started.ipynb

Lines changed: 7 additions & 7 deletions
@@ -11,7 +11,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [/tutorials/00_getting_started.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started.ipynb) in the `sbi` repository."
+"Note, you can find the original version of this notebook at [/docs/tutorials/00_getting_started.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/00_getting_started.ipynb) in the `sbi` repository."
 ]
 },
 {
@@ -60,7 +60,7 @@
 "2. a candidate (mechanistic) model - _the simulator_ \n",
 "3. prior knowledge or constraints on model parameters - _the prior_\n",
 "\n",
-"If you are new to simulation-based inference, please first read the information on the [homepage of the website](https://sbi-dev.github.io/sbi/) to familiarise with the motivation and relevant terms."
+"If you are new to simulation-based inference, please first read the information on the [homepage of the website](https://sbi.readthedocs.io/en/latest/index.html) to familiarise with the motivation and relevant terms."
 ]
 },
 {
@@ -138,9 +138,9 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"> Note: In `sbi` version 0.23.0, we renamed all inference classes from, e.g., `SNPE`, to `NPE` (i.e., we removed the `S` prefix). The functionality of the classes remains the same. The `NPE` class handles both the amortized (as shown in this tutorial) and sequential (as shown [here](https://sbi-dev.github.io/sbi/tutorial/02_multiround_inference/)) versions of neural posterior estimation. An alias for `SNPE` still exists for backwards compatibility.\n",
+"> Note: In `sbi` version 0.23.0, we renamed all inference classes from, e.g., `SNPE`, to `NPE` (i.e., we removed the `S` prefix). The functionality of the classes remains the same. The `NPE` class handles both the amortized (as shown in this tutorial) and sequential (as shown [here](https://sbi.readthedocs.io/en/latest/tutorials/02_multiround_inference.html)) versions of neural posterior estimation. An alias for `SNPE` still exists for backwards compatibility.\n",
 "\n",
-"> Note: This is where you could specify an alternative inference object such as NRE for ratio estimation or NLE for likelihood estimation. Here, you can see [all implemented methods](https://sbi-dev.github.io/sbi/latest/tutorials/16_implemented_methods/)."
+"> Note: This is where you could specify an alternative inference object such as NRE for ratio estimation or NLE for likelihood estimation. Here, you can see [all implemented methods](https://sbi.readthedocs.io/en/latest/tutorials/16_implemented_methods.html)."
 ]
 },
 {
@@ -456,11 +456,11 @@
 "## Next steps\n",
 "\n",
 "To learn more about the capabilities of `sbi`, you can head over to the tutorial\n",
-"[01_gaussian_amortized](01_gaussian_amortized.md), for inferring parameters for multiple\n",
+"[01_gaussian_amortized](https://sbi.readthedocs.io/en/latest/tutorials/01_gaussian_amortized.html), for inferring parameters for multiple\n",
 "observations without retraining.\n",
 "\n",
 "Alternatively, for an example with an __actual__ simulator, you can read our example\n",
-"for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](Example_00_HodgkinHuxleyModel.md)."
+"for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](https://sbi.readthedocs.io/en/latest/tutorials/Example_00_HodgkinHuxleyModel.html)."
 ]
 }
 ],
@@ -480,7 +480,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 }
 },
 "nbformat": 4,

docs/tutorials/01_gaussian_amortized.ipynb

Lines changed: 5 additions & 5 deletions
@@ -11,7 +11,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
+"Note, you can find the original version of this notebook at [docs/tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
 ]
 },
 {
@@ -20,7 +20,7 @@
 "source": [
 "In this tutorial, we introduce **amortization** that is the capability to evaluate the posterior for different observations without having to re-run inference.\n",
 "\n",
-"We will demonstrate how `sbi` can infer an amortized posterior for the illustrative linear Gaussian example introduced in [Getting Started](https://sbi-dev.github.io/sbi/latest/tutorials/00_getting_started/), that takes in 3 parameters ($\\theta$). "
+"We will demonstrate how `sbi` can infer an amortized posterior for the illustrative linear Gaussian example introduced in [Getting Started](https://sbi.readthedocs.io/en/latest/tutorials/00_getting_started.html), that takes in 3 parameters ($\\theta$). "
 ]
 },
 {
@@ -214,10 +214,10 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Next steps\n",
+"## Next steps\n",
 "\n",
 "Now that you got familiar with amortization and are probably good to go and have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial\n",
-"[02_multiround_inference](02_multiround_inference.md) which aims to make inference for a single observation more sampling efficient."
+"[02_multiround_inference](https://sbi.readthedocs.io/en/latest/tutorials/02_multiround_inference.html) which aims to make inference for a single observation more sampling efficient."
 ]
 }
 ],
@@ -237,7 +237,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 }
 },
 "nbformat": 4,

docs/tutorials/02_multiround_inference.ipynb

Lines changed: 4 additions & 18 deletions
@@ -17,7 +17,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
+"Note, you can find the original version of this notebook at [docs/tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
 ]
 },
 {
@@ -54,14 +54,7 @@
 "name": "stdout",
 "output_type": "stream",
 "text": [
-" Neural network successfully converged after 196 epochs."
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Using SNPE-C with atomic loss\n",
+" Neural network successfully converged after 196 epochs.Using SNPE-C with atomic loss\n",
 " Neural network successfully converged after 37 epochs."
 ]
 }
@@ -173,14 +166,7 @@
 "name": "stdout",
 "output_type": "stream",
 "text": [
-" Neural network successfully converged after 277 epochs."
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Using SNPE-C with atomic loss\n",
+" Neural network successfully converged after 277 epochs.Using SNPE-C with atomic loss\n",
 " Neural network successfully converged after 35 epochs."
 ]
 }
@@ -260,7 +246,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 }
 },
 "nbformat": 4,

docs/tutorials/03_density_estimators.ipynb

Lines changed: 4 additions & 11 deletions
@@ -18,7 +18,7 @@
 "[`nflows`](https://github.com/bayesiains/nflows/) (via `pyknos`) or [`zuko`](https://github.com/probabilists/zuko). \n",
 "\n",
 "For all options, check the API reference\n",
-"[here](https://sbi-dev.github.io/sbi/reference/models/).\n"
+"[here](https://sbi.readthedocs.io/en/latest/reference/sbi.models.html).\n"
 ]
 },
 {
@@ -113,7 +113,7 @@
 "source": [
 "It is also possible to pass an `embedding_net` to `posterior_nn()` to automatically\n",
 "learn summary statistics from high-dimensional simulation outputs. You can find a more\n",
-"detailed tutorial on this in [04_embedding_networks](04_embedding_networks.md).\n"
+"detailed tutorial on this in [04_embedding_networks](https://sbi.readthedocs.io/en/latest/tutorials/04_embedding_networks.html).\n"
 ]
 },
 {
@@ -137,15 +137,8 @@
 "- `loss(input, condition, **kwargs)`: Return the loss for training the density estimator.\n",
 "- `sample(sample_shape, condition, **kwargs)`: Return samples from the density estimator.\n",
 "\n",
-"See more information on the [Reference API page](https://sbi-dev.github.io/sbi/reference/models/#sbi.neural_nets.density_estimators.ConditionalDensityEstimator)."
+"See more information on the [Reference API page](https://sbi.readthedocs.io/en/latest/sbi.html)."
 ]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": []
 }
 ],
 "metadata": {
@@ -164,7 +157,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 }
 },
 "nbformat": 4,

docs/tutorials/04_embedding_networks.ipynb

Lines changed: 3 additions & 5 deletions
@@ -6,8 +6,7 @@
 "source": [
 "# Embedding nets for observations\n",
 "\n",
-"!!! note\n",
-" You can find the original version of this notebook at [tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
+"Note, you can find the original version of this notebook at [docs/tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
 "\n",
 "## Introduction\n",
 "\n",
@@ -251,8 +250,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"!!! note\n",
-" See [here](https://github.com/sbi-dev/sbi/blob/main/sbi/neural_nets/embedding_nets.py) for details on all hyperparametes for each available embedding net in `sbi`\n",
+"See [here](https://github.com/sbi-dev/sbi/blob/main/sbi/neural_nets/embedding_nets.py) for details on all hyperparametes for each available embedding net in `sbi`\n",
 "\n",
 "## The inference procedure\n",
 "\n",
@@ -432,7 +430,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 }
 },
 "nbformat": 4,

docs/tutorials/05_conditional_distributions.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
+"Note, you can find the original version of this notebook at [docs/tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
 ]
 },
 {

docs/tutorials/06_restriction_estimator.ipynb

Lines changed: 2 additions & 8 deletions
@@ -231,13 +231,7 @@
 "text": [
 "The `RestrictedPrior` rejected 53.2%\n",
 " of prior samples. You will get a speed-up of\n",
-" 113.8%.\n"
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
+" 113.8%.\n",
 "Simulation outputs: tensor([[ 1.7930, 1.5322],\n",
 " [ 1.6024, 1.6551],\n",
 " [ 0.0756, 2.4023],\n",
@@ -329,7 +323,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 },
 "toc": {
 "base_numbering": 1,

docs/tutorials/08_crafting_summary_statistics.ipynb

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@
 "metadata": {},
 "source": [
 "Many simulators produce outputs that are high-dimesional. For example, a simulator might\n",
-"generate a time series or an image. In the tutorial [04_embedding_networks](04_embedding_networks.md), we discussed how a\n",
+"generate a time series or an image. In the tutorial [04_embedding_networks](https://sbi.readthedocs.io/en/latest/tutorials/04_embedding_networks.html), we discussed how a\n",
 "neural networks can be used to learn summary statistics from such data. In this\n",
 "notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that\n",
 "the choice of summary statistics can be crucial for the performance of the inference\n",
@@ -599,7 +599,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.14"
+"version": "3.12.4"
 },
 "toc": {
 "base_numbering": 1,

docs/tutorials/11_diagnostics_simulation_based_calibration.ipynb

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@
 "the estimator should be made subject to several **diagnostic tests**. This needs to be\n",
 "performed before being used for inference given the actual observed data. _Posterior\n",
 "Predictive Checks_ (see [10_diagnostics_posterior_predictive_checks\n",
-"tutorial](10_diagnostics_posterior_predictive_checks.md)) provide one way to \"critique\" a trained\n",
+"tutorial](http://localhost:8000/tutorials/10_diagnostics_posterior_predictive_checks.html)) provide one way to \"critique\" a trained\n",
 "estimator based on its predictive performance. Another important approach to such\n",
 "diagnostics is simulation-based calibration as developed by [Cook et al,\n",
 "2006](https://www.tandfonline.com/doi/abs/10.1198/106186006X136976) and [Talts et al,\n",
@@ -909,7 +909,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "sbi-dev",
+"display_name": "Python 3 (ipykernel)",
 "language": "python",
 "name": "python3"
 },
