* Add readme note on Julia version
* Bump Turing to 0.35
* Update minimum supported Julia version
* Remove unnecessary version qualifier
* Remove Tracker and replace with Mooncake, except in BNN doc
* Use Mooncake in BNN doc (#521)
* Fix BNN doc to work with Mooncake
Now we extract the parameter samples from the sampled chain as `θ` (this is of size `5000 x 20` where `5000` is the number of iterations and `20` is the number of parameters).
tutorials/10-bayesian-differential-equations/index.qmd (+1 −1)

```diff
@@ -320,7 +320,7 @@ More theoretical details on these methods can be found at: https://docs.sciml.ai
 While these sensitivity analysis methods may seem complicated, using them is dead simple.
 Here is a version of the Lotka-Volterra model using adjoint sensitivities.
-All we have to do is switch the AD backend to one of the adjoint-compatible backends (ReverseDiff, Tracker, or Zygote)!
+All we have to do is switch the AD backend to one of the adjoint-compatible backends (ReverseDiff or Zygote)!
 Notice that on this model adjoints are slower.
 This is because adjoints have a higher overhead on small parameter models and therefore we suggest using these methods only for models with around 100 parameters or more.
 For more details, see https://arxiv.org/abs/1812.01892.
```
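In practice, the switch described in the changed line above is a one-keyword change. A minimal sketch, using a trivial stand-in model (the tutorial's actual model wraps the Lotka-Volterra ODE):

```julia
using Turing
import Zygote  # the reverse-mode backend must be loaded before it is requested

# Stand-in model; in the tutorial this would be the Lotka-Volterra model
@model function gdemo(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

# Only the `adtype` keyword changes to switch AD backends:
chain = sample(gdemo(1.5), NUTS(; adtype=AutoZygote()), 1000)
```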
tutorials/docs-00-getting-started/index.qmd (+1 −1)

```diff
@@ -16,7 +16,7 @@ Pkg.instantiate();
 To use Turing, you need to install Julia first and then install Turing.
-You will need to install Julia 1.7 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
+You will need to install Julia 1.10 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
 Turing is officially registered in the [Julia General package registry](https://github.com/JuliaRegistries/General), which means that you can install a stable version of Turing by running the following in the Julia REPL:
```
tutorials/docs-10-using-turing-autodiff/index.qmd (+4 −4)

```diff
@@ -12,9 +12,8 @@ Pkg.instantiate();
 ## Switching AD Modes
-Turing currently supports four automatic differentiation (AD) backends for sampling: [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) for forward-mode AD; and [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl), [Zygote](https://github.com/FluxML/Zygote.jl), and [Tracker](https://github.com/FluxML/Tracker.jl) for reverse-mode AD.
-While `Tracker` is still available, its use is discouraged due to a lack of active maintenance.
-`ForwardDiff` is automatically imported by Turing. To utilize `Zygote` or `ReverseDiff` for AD, users must explicitly import them with `using Zygote` or `using ReverseDiff`, alongside `using Turing`.
+Turing currently supports four automatic differentiation (AD) backends for sampling: [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) for forward-mode AD; and [Mooncake](https://github.com/compintell/Mooncake.jl), [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl), and [Zygote](https://github.com/FluxML/Zygote.jl) for reverse-mode AD.
+`ForwardDiff` is automatically imported by Turing. To utilize `Mooncake`, `Zygote`, or `ReverseDiff` for AD, users must explicitly import them with `import Mooncake`, `import Zygote` or `import ReverseDiff`, alongside `using Turing`.
 As of Turing version v0.30, the global configuration flag for the AD backend has been removed in favour of [`AdTypes.jl`](https://github.com/SciML/ADTypes.jl), allowing users to specify the AD backend for individual samplers independently.
 Users can pass the `adtype` keyword argument to the sampler constructor to select the desired AD backend, with the default being `AutoForwardDiff(; chunksize=0)`.
@@ -69,7 +68,8 @@ Generally, reverse-mode AD, for instance `ReverseDiff`, is faster when sampling
 If the differentiation method is not specified in this way, Turing will default to using whatever the global AD backend is.
 Currently, this defaults to `ForwardDiff`.
-The most reliable way to ensure you are using the fastest AD that works for your problem is to benchmark them using `TuringBenchmarking`:
+The most reliable way to ensure you are using the fastest AD that works for your problem is to benchmark them using [`TuringBenchmarking`](https://github.com/TuringLang/TuringBenchmarking.jl):
```
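As a rough illustration of the benchmarking workflow the new line links to, a sketch along these lines should work (`demo` is a stand-in model of my own, and the `make_turing_suite` keyword names are assumptions; see the TuringBenchmarking README for the exact API):

```julia
using Turing, TuringBenchmarking
import ReverseDiff

# A small stand-in model to benchmark
@model function demo(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    x .~ Normal(m, sqrt(s))
end

# Build and run a benchmark suite comparing AD backends on this model
suite = make_turing_suite(demo(randn(10)); adbackends=[AutoForwardDiff(), AutoReverseDiff()])
run(suite)
```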
tutorials/docs-12-using-turing-guide/index.qmd (+2 −2)

````diff
@@ -166,7 +166,7 @@ The `chains` variable now contains a `Chains` object which can be indexed by cha
 #### Multithreaded sampling
-If you wish to perform multithreaded sampling and are running Julia 1.3 or greater, you can call `sample` with the following signature:
+If you wish to perform multithreaded sampling, you can call `sample` with the following signature:
 ```{julia}
 #| eval: false
@@ -514,7 +514,7 @@ ForwardDiff (Turing's default AD backend) uses forward-mode chunk-wise AD. The c
 #### AD Backend
-Turing supports four automatic differentiation (AD) packages in the back end during sampling. The default AD backend is [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) for forward-mode AD. Three reverse-mode AD backends are also supported, namely [Tracker](https://github.com/FluxML/Tracker.jl), [Zygote](https://github.com/FluxML/Zygote.jl) and [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl). `Zygote` and `ReverseDiff` are supported optionally if explicitly loaded by the user with `using Zygote` or `using ReverseDiff` next to `using Turing`.
+Turing supports four automatic differentiation (AD) packages in the back end during sampling. The default AD backend is [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) for forward-mode AD. Three reverse-mode AD backends are also supported, namely [Mooncake](https://github.com/compintell/Mooncake.jl), [Zygote](https://github.com/FluxML/Zygote.jl) and [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl). `Mooncake`, `Zygote`, and `ReverseDiff` also require the user to explicitly load them using `import Mooncake`, `import Zygote`, or `import ReverseDiff` next to `using Turing`.
 For more information on Turing's automatic differentiation backend, please see the [Automatic Differentiation]({{< meta using-turing-autodiff >}}) article.
````
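For reference, the multithreaded `sample` signature that the changed line refers to looks like this (a sketch with a toy model of my own; it requires starting Julia with multiple threads, e.g. `julia --threads 4`):

```julia
using Turing

# Toy model: infer the bias of a coin from Boolean flips
@model function coinflip(y)
    p ~ Beta(1, 1)
    y .~ Bernoulli(p)
end

# Sample 4 chains of 1000 iterations each, one chain per available thread
chains = sample(coinflip(rand(Bool, 100)), NUTS(), MCMCThreads(), 1000, 4)
```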
tutorials/docs-13-using-turing-performance-tips/index.qmd (+6 −5)

```diff
@@ -43,19 +43,20 @@ end
 ## Choose your AD backend
 Automatic differentiation (AD) makes it possible to use modern, efficient gradient-based samplers like NUTS and HMC, and that means a good AD system is incredibly important. Turing currently
-supports several AD backends, including [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) (the default), [Zygote](https://github.com/FluxML/Zygote.jl),
-[ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl), and [Tracker](https://github.com/FluxML/Tracker.jl). Experimental support is also available for
-[Tapir](https://github.com/withbayes/Tapir.jl).
+supports several AD backends, including [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl) (the default),
 For many common types of models, the default ForwardDiff backend performs great, and there is no need to worry about changing it. However, if you need more speed, you can try
 different backends via the standard [ADTypes](https://github.com/SciML/ADTypes.jl) interface by passing an `AbstractADType` to the sampler with the optional `adtype` argument, e.g.
 `NUTS(adtype = AutoZygote())`. See [Automatic Differentiation]({{< meta using-turing-autodiff >}}) for details. Generally, `adtype = AutoForwardDiff()` is likely to be the fastest and most reliable for models with
 few parameters (say, less than 20 or so), while reverse-mode backends such as `AutoZygote()` or `AutoReverseDiff()` will perform better for models with many parameters or linear algebra
 operations. If in doubt, it's easy to try a few different backends to see how they compare.
-### Special care for Zygote and Tracker
+### Special care for Zygote
-Note that Zygote and Tracker will not perform well if your model contains `for`-loops, due to the way reverse-mode AD is implemented in these packages. Zygote also cannot differentiate code
+Note that Zygote will not perform well if your model contains `for`-loops, due to the way reverse-mode AD is implemented in this package. Zygote also cannot differentiate code
 that contains mutating operations. If you can't implement your model without `for`-loops or mutation, `ReverseDiff` will be a better, more performant option. In general, though,
 vectorized operations are still likely to perform best.
```
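To make the `for`-loop advice concrete, here is a hedged sketch of the two styles side by side (`filldist` comes from DistributionsAD and is re-exported by Turing; the model names are my own):

```julia
using Turing

# Loop form: each observation is a separate `~` statement,
# which reverse-mode backends like Zygote handle poorly
@model function loopy(x)
    m ~ Normal(0, 1)
    for i in eachindex(x)
        x[i] ~ Normal(m, 1)
    end
end

# Vectorized form: one multivariate statement, friendlier to reverse-mode AD
@model function vectorized(x)
    m ~ Normal(0, 1)
    x ~ filldist(Normal(m, 1), length(x))
end
```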
0 commit comments