`CalibrateEmulateSample.jl` solves parameter estimation problems using accelerated (and approximate) Bayesian inversion. The framework can currently be applied to learn the joint distribution for a flexible number of parameters, and it is not inherently restricted to unimodal distributions. Please see the [example below!](@ref inv-prob-front)
It can be used with computer models that:
- can be noisy or chaotic,
```math
y = \mathcal{G}(\theta) + \eta,
```
where the noise ``\eta`` is drawn from a ``d``-dimensional Gaussian with distribution ``\mathcal{N}(0, \Gamma_y)``.
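As an illustration, this relationship can be sketched in Julia. The forward map ``\mathcal{G}``, the true parameters, and the noise covariance below are hypothetical stand-ins, not part of the package:

```julia
# A minimal sketch of y = G(θ) + η with η ~ N(0, Γ_y).
# G, θ_true, and Γ_y are toy placeholders for illustration only.
using LinearAlgebra, Distributions, Random

Random.seed!(42)

G(θ) = [θ[1] + θ[2], θ[1] * θ[2], θ[1]^2]  # toy 2-parameter, 3-output model

θ_true = [2.0, 1.0]
Γ_y = 0.1^2 * Matrix(I, 3, 3)              # observational noise covariance
η = rand(MvNormal(zeros(3), Γ_y))          # noise draw η ~ N(0, Γ_y)
y = G(θ_true) + η                          # the noisy observation
```

In a real application ``\mathcal{G}`` would be the (possibly expensive, noisy, or chaotic) computer model, and ``\Gamma_y`` would encode the observational uncertainty.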
### Quick links
- [How do I build prior distributions?](https://clima.github.io/EnsembleKalmanProcesses.jl/dev/parameter_distributions/)
- [How do I build good observational noise covariances?](https://clima.github.io/EnsembleKalmanProcesses.jl/dev/observations/)
- [What ensemble size should I take? Which process should I use? What is the recommended configuration?](https://clima.github.io/EnsembleKalmanProcesses.jl/dev/defaults/)
- [Where can I walk through the simple example?](@ref sinusoid-example)
- [What is the `EnsembleKalmanProcesses.jl` package?](@ref calibrate)
- [What are the recommendations/defaults for dimension reduction?](@ref data-proc)
- [How do I plot or interpret the posterior distribution?](@ref get-posterior)
### [The inverse problem](@id inv-prob-front)
Given an observation ``y``, the computer model ``\mathcal{G}``, the observational noise ``\Gamma_y``, and some broad prior information on ``\theta``, we return a data-informed joint distribution for "``\theta`` given ``y``".
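Concretely, the target of the inversion is the (unnormalized) log-posterior, combining the Gaussian likelihood implied by ``y = \mathcal{G}(\theta) + \eta`` with the prior. The sketch below is hand-written for illustration, not the package's implementation; the toy forward map, observation, and prior width are all assumptions:

```julia
# Unnormalized log-posterior for θ given y, under y = G(θ) + η, η ~ N(0, Γ_y),
# with a broad zero-mean Gaussian prior. All concrete values are hypothetical.
using LinearAlgebra

G(θ) = [θ[1] + θ[2], θ[1] * θ[2], θ[1]^2]  # toy forward map
Γ_y = 0.1^2 * Matrix(I, 3, 3)              # observational noise covariance
y = [3.0, 2.0, 4.0]                        # a hypothetical observation

logprior(θ) = -0.5 * sum(θ .^ 2) / 10.0^2              # broad N(0, 10²I) prior
loglik(θ) = -0.5 * dot(y - G(θ), Γ_y \ (y - G(θ)))     # Gaussian likelihood
logpost(θ) = loglik(θ) + logprior(θ)
```

Parameter values that reproduce the observation score much higher than distant ones; CES approximates and samples exactly this kind of distribution without requiring many evaluations of an expensive ``\mathcal{G}``.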
`docs/src/sample.md`
The "sample" part of CES refers to exact sampling from the emulated posterior, in particular via a [Markov chain Monte Carlo algorithm](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo) (MCMC). Within this paradigm, we want to provide the flexibility to use multiple sampling algorithms; the approach we take is to use the general-purpose [AbstractMCMC.jl](https://turing.ml/dev/docs/for-developers/interface) API, provided by the [Turing.jl](https://turing.ml/dev/) probabilistic programming framework.
## [User interface](@id sample-ui)
We briefly outline an instance of how one sets up and uses MCMC within the CES package. The user first loads the MCMC module, and provides one of the Protocols (i.e. how one wishes to generate sampling proposals)
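As a rough illustration of what a proposal-based sampler does under the hood, here is a generic random-walk Metropolis loop on a toy target. This is not the CES or AbstractMCMC.jl API, just a self-contained sketch of proposal generation followed by accept/reject:

```julia
# Generic random-walk Metropolis sketch (illustrative only, not package code).
using Random

Random.seed!(0)

logposterior(θ) = -0.5 * sum((θ .- 1.0) .^ 2)  # toy target: N(1, I)

function metropolis(logp, θ0; n = 5000, step = 0.5)
    θ = copy(θ0)
    chain = [copy(θ)]
    for _ in 1:n
        proposal = θ .+ step .* randn(length(θ))   # random-walk proposal
        if log(rand()) < logp(proposal) - logp(θ)  # accept/reject step
            θ = proposal
        end
        push!(chain, copy(θ))
    end
    return chain
end

chain = metropolis(logposterior, zeros(2))
```

In CES, the target density is the emulated posterior rather than a toy Gaussian, and the proposal mechanism is supplied by the chosen protocol.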