
Commit 83ffb3d

Fix broken links in some tutorials (#1541)
1 parent 1928f01 commit 83ffb3d

10 files changed: +113 −130 lines

docs/advanced_tutorials/03_density_estimators.ipynb

Lines changed: 4 additions & 6 deletions

@@ -18,7 +18,7 @@
 "[`nflows`](https://github.com/bayesiains/nflows/) (via `pyknos`) or [`zuko`](https://github.com/probabilists/zuko). \n",
 "\n",
 "For all options, check the API reference\n",
-"[here](https://sbi.readthedocs.io/en/latest/reference/sbi.models.html).\n"
+"[here](https://sbi.readthedocs.io/en/latest/sbi.html#neural-nets)."
 ]
 },
 {
@@ -113,7 +113,7 @@
 "source": [
 "It is also possible to pass an `embedding_net` to `posterior_nn()` to automatically\n",
 "learn summary statistics from high-dimensional simulation outputs. You can find a more\n",
-"detailed tutorial on this in [04_embedding_networks](https://sbi.readthedocs.io/en/latest/tutorials/04_embedding_networks.html).\n"
+"detailed tutorial on this in [04_embedding_networks](https://sbi.readthedocs.io/en/latest/how_to_guide/04_embedding_networks.html).\n"
 ]
 },
 {
@@ -131,13 +131,11 @@
 "\n",
 "For this, the `density_estimator` argument needs to be a function that takes `theta` and `x` batches as arguments to then construct the density estimator after the first set of simulations was generated. Our factory functions in `sbi/neural_nets/factory.py` return such a function.\n",
 "\n",
-"The returned `density_estimator` object needs to be a subclass of [`DensityEstimator`](https://sbi-dev.github.io/sbi/reference/#sbi.neural_nets.density_estimators.DensityEstimator), which requires to implement three methods:\n",
+"The returned `density_estimator` object needs to be a subclass of [`DensityEstimator`](https://github.com/sbi-dev/sbi/blob/1928f018fa08bb0c5309a34d8e95b9f2916b20a5/sbi/neural_nets/estimators/base.py#L11), which requires to implement three methods:\n",
 " \n",
 "- `log_prob(input, condition, **kwargs)`: Return the log probabilities of the inputs given a condition or multiple i.e. batched conditions.\n",
 "- `loss(input, condition, **kwargs)`: Return the loss for training the density estimator.\n",
-"- `sample(sample_shape, condition, **kwargs)`: Return samples from the density estimator.\n",
-"\n",
-"See more information on the [Reference API page](https://sbi.readthedocs.io/en/latest/sbi.html)."
+"- `sample(sample_shape, condition, **kwargs)`: Return samples from the density estimator."
 ]
 }
 ],
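The hunk above trims the `DensityEstimator` description down to its three required methods. As a rough illustration of that contract — a pure-Python toy, not sbi's actual class, which subclasses `torch.nn.Module` and operates on batched tensors — a fixed-scale Gaussian whose mean is given by the condition might look like:

```python
import math
import random

class ToyGaussianEstimator:
    """Hypothetical stand-in for the DensityEstimator interface:
    a fixed-width Gaussian centered on the condition value."""

    def __init__(self, scale=1.0):
        self.scale = scale

    def log_prob(self, input, condition):
        # Log-density of `input` under N(condition, scale^2).
        z = (input - condition) / self.scale
        return -0.5 * z * z - math.log(self.scale * math.sqrt(2 * math.pi))

    def loss(self, input, condition):
        # A standard choice: per-example negative log-likelihood.
        return -self.log_prob(input, condition)

    def sample(self, sample_shape, condition):
        # Draw `sample_shape` independent samples around the condition.
        return [random.gauss(condition, self.scale) for _ in range(sample_shape)]
```

The scalar inputs and integer `sample_shape` are simplifications; the real interface takes tensor `input`/`condition` batches and a shape tuple.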

docs/advanced_tutorials/04_embedding_networks.ipynb

Lines changed: 1 addition & 1 deletion

@@ -221,7 +221,7 @@
 "\n",
 "- Fully-connected multi-layer perceptron\n",
 "- Convolutional neural network (1D and 2D convolutions)\n",
-"- Permutation-invariant neural network (for trial-based data, see [here](https://sbi-dev.github.io/sbi/latest/tutorials/12_iid_data_and_permutation_invariant_embeddings/))\n",
+"- Permutation-invariant neural network (for trial-based data, see [here](https://sbi.readthedocs.io/en/latest/how_to_guide/08_permutation_invariant_embeddings.html))\n",
 "\n",
 "In the example considered here, the most appropriate `embedding_net` would be a CNN for two-dimensional images. We can setup it as per:\n"
 ]
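The permutation-invariant option in the hunk above exists because trial-based (i.i.d.) data should produce the same embedding regardless of trial order. The property itself can be sketched in plain Python; here `phi` is a made-up stand-in for the per-trial network:

```python
def phi(trial):
    # Hypothetical per-trial feature map (stands in for a small MLP).
    return [trial, trial * trial]

def permutation_invariant_embedding(trials):
    # Mean-pool the per-trial features: the result is unchanged under
    # any reordering of the trials, which is the property the
    # permutation-invariant embedding net provides for i.i.d. data.
    feats = [phi(t) for t in trials]
    n = len(feats)
    return [sum(f[d] for f in feats) / n for d in range(len(feats[0]))]
```

Any symmetric pooling (sum, mean, max) gives the same invariance; mean-pooling is used here only for concreteness.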

docs/advanced_tutorials/05_conditional_distributions.ipynb

Lines changed: 2 additions & 2 deletions

@@ -291,7 +291,7 @@
 "source": [
 "## Sampling conditional distributions\n",
 "\n",
-"So far, we have demonstrated how one can plot 2D conditional distributions with `conditional_pairplot()` and how one can compute the pairwise conditional correlation coefficient with `conditional_corrcoeff()`. In some cases, it can be useful to keep a subset of parameters fixed and to vary **more than two** parameters. This can be done by sampling the conditonal posterior $p(\\theta_i | \\theta_{j \\neq i}, x_o)$. As of `sbi` `v0.18.0`, this functionality requires using the [sampler interface](https://sbi-dev.github.io/sbi/latest/tutorials/09_sampler_interface/). In this tutorial, we demonstrate this functionality on a linear gaussian simulator with four parameters. We would like to fix the forth parameter to $\\theta_4=0.2$ and sample the first three parameters given that value, i.e. we want to sample $p(\\theta_1, \\theta_2, \\theta_3 | \\theta_4 = 0.2, x_o)$. For an application in neuroscience, see [Deistler, Gonçalves, Macke, 2021](https://www.biorxiv.org/content/10.1101/2021.07.30.454484v4.abstract).\n"
+"So far, we have demonstrated how one can plot 2D conditional distributions with `conditional_pairplot()` and how one can compute the pairwise conditional correlation coefficient with `conditional_corrcoeff()`. In some cases, it can be useful to keep a subset of parameters fixed and to vary **more than two** parameters. This can be done by sampling the conditonal posterior $p(\\theta_i | \\theta_{j \\neq i}, x_o)$. As of `sbi` `v0.18.0`, this functionality requires using the [sampler interface](https://sbi.readthedocs.io/en/latest/how_to_guide/09_sampler_interface.html). In this tutorial, we demonstrate this functionality on a linear gaussian simulator with four parameters. We would like to fix the forth parameter to $\\theta_4=0.2$ and sample the first three parameters given that value, i.e. we want to sample $p(\\theta_1, \\theta_2, \\theta_3 | \\theta_4 = 0.2, x_o)$. For an application in neuroscience, see [Deistler, Gonçalves, Macke, 2021](https://www.biorxiv.org/content/10.1101/2021.07.30.454484v4.abstract).\n"
 ]
 },
 {
@@ -362,7 +362,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Now we want to build the conditional potential (please read through the tutorial [09_sampler_interface](https://sbi-dev.github.io/sbi/dev/tutorials/09_sampler_interface/) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the forth parameter on $\\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
+"Now we want to build the conditional potential (please read through [this tutorial](https://sbi.readthedocs.io/en/latest/how_to_guide/09_sampler_interface.html) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the forth parameter on $\\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
 ]
 },
 {
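The second hunk above describes passing a `condition` entry for every parameter, of which only the entries outside `dims_to_sample` are actually used. A minimal sketch of that bookkeeping (a hypothetical helper, not sbi's actual implementation):

```python
def make_conditional_potential(log_prob_fn, condition, dims_to_sample):
    """Return a potential over only the free dimensions.

    `condition` holds one value per parameter; entries at positions in
    `dims_to_sample` are placeholders that get overwritten by the free
    parameters, mirroring how sbi ignores them.
    """
    def potential(theta_free):
        theta = list(condition)
        for pos, value in zip(dims_to_sample, theta_free):
            theta[pos] = value
        return log_prob_fn(theta)
    return potential
```

With a toy log-probability `lambda th: -sum(t * t for t in th)` and `condition = [0.0, 0.0, 0.0, 0.2]`, sampling over `dims_to_sample = [0, 1, 2]` always evaluates the potential with the fourth parameter pinned at 0.2, just as in the notebook's setup.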

docs/advanced_tutorials/13_diagnostics_lc2st.ipynb

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@
 " \n",
 "*Posterior Predictive Checks* (see [this tutorial](https://sbi.readthedocs.io/en/latest/advanced_tutorials/10_diagnostics_posterior_predictive_checks.html)) provide one way to \"critique\" a trained estimator via its predictive performance. \n",
 " \n",
-"Another approach is *Simulation-Based Calibration* (SBC, see [this tutorial](https://sbi.readthedocs.io/en/latest/advanced_tutorials/11_diagnostics_simulation_based_calibration.html)). SBC evaluates whether the estimated posterior is balanced, i.e., neither over-confident nor under-confident. These checks are performed ***in expectation (on average) over the observation space***, i.e. they are performed on a set of $(\\theta,x)$ pairs sampled from the joint distribution over simulator parameters $\\theta$ and corresponding observations $x$. As such, SBC is a ***global validation method*** that can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, *this is no guarantee that the posterior estimation is working*.\n",
+"Another approach is *Simulation-Based Calibration* (SBC, see [this tutorial](https://sbi.readthedocs.io/en/latest/how_to_guide/16_sbc.html)). SBC evaluates whether the estimated posterior is balanced, i.e., neither over-confident nor under-confident. These checks are performed ***in expectation (on average) over the observation space***, i.e. they are performed on a set of $(\\theta,x)$ pairs sampled from the joint distribution over simulator parameters $\\theta$ and corresponding observations $x$. As such, SBC is a ***global validation method*** that can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, *this is no guarantee that the posterior estimation is working*.\n",
 "\n",
 "**Local Classifier Two-Sample Tests** ($\\ell$-C2ST) as developed by [Linhart et al, 2023](https://arxiv.org/abs/2306.03580) present a new ***local validation method*** that allows to evaluate the correctness of the posterior estimator ***at a fixed observation***, i.e. they work on a single $(\\theta,x)$ pair. They provide necessary *and sufficient* conditions for the validity of the SBI algorithm, as well as easy-to-interpret qualitative and quantitative diagnostics. \n",
 " \n",
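The $\ell$-C2ST method referenced in this hunk builds on the classifier two-sample test: label samples from two sets 0 and 1, fit a classifier, and read off held-out accuracy, where accuracy near 0.5 means the sets are indistinguishable. A toy version, with a nearest-class-mean classifier standing in for the neural classifier the real method trains:

```python
import random

def c2st_accuracy(samples_p, samples_q):
    """Toy classifier two-sample test on 1D samples: held-out accuracy
    near 0.5 means the classifier cannot tell the two sets apart."""
    data = [(x, 0) for x in samples_p] + [(x, 1) for x in samples_q]
    random.Random(0).shuffle(data)  # deterministic train/test split
    split = len(data) // 2
    train, test = data[:split], data[split:]
    # Nearest-class-mean classifier (stand-in for the neural classifier).
    m0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
    m1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
    hits = sum((abs(x - m1) < abs(x - m0)) == (y == 1) for x, y in test)
    return hits / len(test)
```

In $\ell$-C2ST the two sets are posterior samples versus a calibration reference at a fixed observation, and the classifier is a learned one; the accuracy-as-discrepancy logic is the same.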
