docs/tutorials/00_getting_started.ipynb (7 additions, 7 deletions)
@@ -11,7 +11,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [/tutorials/00_getting_started.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started.ipynb) in the `sbi` repository."
+"Note, you can find the original version of this notebook at [/docs/tutorials/00_getting_started.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/00_getting_started.ipynb) in the `sbi` repository."
 ]
 },
 {
@@ -60,7 +60,7 @@
 "2. a candidate (mechanistic) model - _the simulator_ \n",
 "3. prior knowledge or constraints on model parameters - _the prior_\n",
 "\n",
-"If you are new to simulation-based inference, please first read the information on the [homepage of the website](https://sbi-dev.github.io/sbi/) to familiarise with the motivation and relevant terms."
+"If you are new to simulation-based inference, please first read the information on the [homepage of the website](https://sbi.readthedocs.io/en/latest/index.html) to familiarise with the motivation and relevant terms."
 ]
 },
 {
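For orientation, a minimal sketch of what these three ingredients typically look like in `sbi`; the bounds, noise level, and simulator below are illustrative placeholders, not the tutorial's exact model.

```python
import torch
from sbi.utils import BoxUniform

# Prior: a uniform box over the 3 model parameters (bounds are illustrative).
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# Simulator: maps a parameter vector theta to simulated data x (a toy stand-in).
def simulator(theta: torch.Tensor) -> torch.Tensor:
    return theta + 0.1 * torch.randn_like(theta)

# Observation: the data point whose posterior we want to infer.
x_o = simulator(prior.sample((1,)))
```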
@@ -138,9 +138,9 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"> Note: In `sbi` version 0.23.0, we renamed all inference classes from, e.g., `SNPE`, to `NPE` (i.e., we removed the `S` prefix). The functionality of the classes remains the same. The `NPE` class handles both the amortized (as shown in this tutorial) and sequential (as shown [here](https://sbi-dev.github.io/sbi/tutorial/02_multiround_inference/)) versions of neural posterior estimation. An alias for `SNPE` still exists for backwards compatibility.\n",
+"> Note: In `sbi` version 0.23.0, we renamed all inference classes from, e.g., `SNPE`, to `NPE` (i.e., we removed the `S` prefix). The functionality of the classes remains the same. The `NPE` class handles both the amortized (as shown in this tutorial) and sequential (as shown [here](https://sbi.readthedocs.io/en/latest/tutorials/02_multiround_inference.html)) versions of neural posterior estimation. An alias for `SNPE` still exists for backwards compatibility.\n",
 "\n",
-"> Note: This is where you could specify an alternative inference object such as NRE for ratio estimation or NLE for likelihood estimation. Here, you can see [all implemented methods](https://sbi-dev.github.io/sbi/latest/tutorials/16_implemented_methods/)."
+"> Note: This is where you could specify an alternative inference object such as NRE for ratio estimation or NLE for likelihood estimation. Here, you can see [all implemented methods](https://sbi.readthedocs.io/en/latest/tutorials/16_implemented_methods.html)."
 ]
 },
 {
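A hedged sketch of the renamed interface this note describes, assuming sbi >= 0.23 and the `prior`/`simulator` placeholders from the sketch above; `NRE` or `NLE` could be substituted for `NPE` in the same pattern.

```python
from sbi.inference import NPE  # `SNPE` remains available as a backwards-compatible alias

inference = NPE(prior=prior)

theta = prior.sample((1000,))
x = simulator(theta)

# Train the neural posterior estimator and build an (amortized) posterior from it.
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)
```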
@@ -456,11 +456,11 @@
 "## Next steps\n",
 "\n",
 "To learn more about the capabilities of `sbi`, you can head over to the tutorial\n",
-"[01_gaussian_amortized](01_gaussian_amortized.md), for inferring parameters for multiple\n",
+"[01_gaussian_amortized](https://sbi.readthedocs.io/en/latest/tutorials/01_gaussian_amortized.html), for inferring parameters for multiple\n",
 "observations without retraining.\n",
 "\n",
 "Alternatively, for an example with an __actual__ simulator, you can read our example\n",
-"for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](Example_00_HodgkinHuxleyModel.md)."
+"for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](https://sbi.readthedocs.io/en/latest/tutorials/Example_00_HodgkinHuxleyModel.html)."
docs/tutorials/01_gaussian_amortized.ipynb (5 additions, 5 deletions)
@@ -11,7 +11,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
+"Note, you can find the original version of this notebook at [docs/tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
 ]
 },
 {
@@ -20,7 +20,7 @@
 "source": [
 "In this tutorial, we introduce **amortization** that is the capability to evaluate the posterior for different observations without having to re-run inference.\n",
 "\n",
-"We will demonstrate how `sbi` can infer an amortized posterior for the illustrative linear Gaussian example introduced in [Getting Started](https://sbi-dev.github.io/sbi/latest/tutorials/00_getting_started/), that takes in 3 parameters ($\\theta$). "
+"We will demonstrate how `sbi` can infer an amortized posterior for the illustrative linear Gaussian example introduced in [Getting Started](https://sbi.readthedocs.io/en/latest/tutorials/00_getting_started.html), that takes in 3 parameters ($\\theta$). "
 ]
 },
 {
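A short sketch of what amortization means in practice, assuming a `posterior` trained as in the getting-started sketch above; the two observations here are hypothetical placeholders.

```python
import torch

# Two hypothetical observations; in the tutorial they come from the simulator.
x_o_1 = torch.zeros(1, 3)
x_o_2 = 2.0 * torch.ones(1, 3)

# The same trained posterior serves both observations, with no retraining in between.
samples_1 = posterior.sample((1000,), x=x_o_1)
samples_2 = posterior.sample((1000,), x=x_o_2)
log_prob_1 = posterior.log_prob(samples_1, x=x_o_1)
```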
@@ -214,10 +214,10 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Next steps\n",
+"## Next steps\n",
 "\n",
 "Now that you got familiar with amortization and are probably good to go and have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial\n",
-"[02_multiround_inference](02_multiround_inference.md) which aims to make inference for a single observation more sampling efficient."
+"[02_multiround_inference](https://sbi.readthedocs.io/en/latest/tutorials/02_multiround_inference.html) which aims to make inference for a single observation more sampling efficient."
docs/tutorials/02_multiround_inference.ipynb (4 additions, 18 deletions)
@@ -17,7 +17,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
+"Note, you can find the original version of this notebook at [docs/tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
 ]
 },
 {
@@ -54,14 +54,7 @@
 "name": "stdout",
 "output_type": "stream",
 "text": [
-" Neural network successfully converged after 196 epochs."
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Using SNPE-C with atomic loss\n",
+" Neural network successfully converged after 196 epochs.Using SNPE-C with atomic loss\n",
 " Neural network successfully converged after 37 epochs."
 ]
 }
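The merged output above shows two training runs, one per round, the second using the atomic SNPE-C loss. A sketch of the standard two-round loop that produces such a trace, assuming the `inference`, `prior`, `simulator`, and `x_o` placeholders from the earlier sketches; the simulation budget is illustrative.

```python
num_rounds = 2
proposal = prior
for _ in range(num_rounds):
    theta = proposal.sample((500,))
    x = simulator(theta)
    # From the second round on, simulations drawn from the posterior proposal trigger the atomic loss.
    density_estimator = inference.append_simulations(theta, x, proposal=proposal).train()
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_o)  # focus the next round on the observation
```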
@@ -173,14 +166,7 @@
 "name": "stdout",
 "output_type": "stream",
 "text": [
-" Neural network successfully converged after 277 epochs."
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Using SNPE-C with atomic loss\n",
+" Neural network successfully converged after 277 epochs.Using SNPE-C with atomic loss\n",
 " Neural network successfully converged after 35 epochs."
"It is also possible to pass an `embedding_net` to `posterior_nn()` to automatically\n",
115
115
"learn summary statistics from high-dimensional simulation outputs. You can find a more\n",
116
-
"detailed tutorial on this in [04_embedding_networks](04_embedding_networks.md).\n"
116
+
"detailed tutorial on this in [04_embedding_networks](https://sbi.readthedocs.io/en/latest/tutorials/04_embedding_networks.html).\n"
117
117
]
118
118
},
119
119
{
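A hedged sketch of the `embedding_net` hand-off described above; the small fully connected net and the 100-dimensional input are placeholders, and `prior` is assumed from the earlier sketches.

```python
import torch.nn as nn
from sbi.inference import NPE
from sbi.neural_nets import posterior_nn

# Placeholder embedding net: compresses a 100-dimensional simulation output to 8 features.
embedding_net = nn.Sequential(nn.Linear(100, 64), nn.ReLU(), nn.Linear(64, 8))

# Build the density estimator with the embedding net and hand it to NPE.
density_estimator_build_fun = posterior_nn(model="maf", embedding_net=embedding_net)
inference = NPE(prior=prior, density_estimator=density_estimator_build_fun)
```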
@@ -137,15 +137,8 @@
 "- `loss(input, condition, **kwargs)`: Return the loss for training the density estimator.\n",
 "- `sample(sample_shape, condition, **kwargs)`: Return samples from the density estimator.\n",
 "\n",
-"See more information on the [Reference API page](https://sbi-dev.github.io/sbi/reference/models/#sbi.neural_nets.density_estimators.ConditionalDensityEstimator)."
+"See more information on the [Reference API page](https://sbi.readthedocs.io/en/latest/sbi.html)."
docs/tutorials/04_embedding_networks.ipynb (3 additions, 5 deletions)
@@ -6,8 +6,7 @@
 "source": [
 "# Embedding nets for observations\n",
 "\n",
-"!!! note\n",
-" You can find the original version of this notebook at [tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
+"Note, you can find the original version of this notebook at [docs/tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
 "\n",
 "## Introduction\n",
 "\n",
@@ -251,8 +250,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"!!! note\n",
-" See [here](https://github.com/sbi-dev/sbi/blob/main/sbi/neural_nets/embedding_nets.py) for details on all hyperparametes for each available embedding net in `sbi`\n",
+"See [here](https://github.com/sbi-dev/sbi/blob/main/sbi/neural_nets/embedding_nets.py) for details on all hyperparametes for each available embedding net in `sbi`\n",
docs/tutorials/05_conditional_distributions.ipynb (1 addition, 1 deletion)
@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note, you can find the original version of this notebook at [tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
+"Note, you can find the original version of this notebook at [docs/tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/docs/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
docs/tutorials/08_crafting_summary_statistics.ipynb (2 additions, 2 deletions)
@@ -12,7 +12,7 @@
 "metadata": {},
 "source": [
 "Many simulators produce outputs that are high-dimesional. For example, a simulator might\n",
-"generate a time series or an image. In the tutorial [04_embedding_networks](04_embedding_networks.md), we discussed how a\n",
+"generate a time series or an image. In the tutorial [04_embedding_networks](https://sbi.readthedocs.io/en/latest/tutorials/04_embedding_networks.html), we discussed how a\n",
 "neural networks can be used to learn summary statistics from such data. In this\n",
 "notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that\n",
 "the choice of summary statistics can be crucial for the performance of the inference\n",