@@ -188,7 +188,7 @@ For ordinary pure functions like `(x,y) -> (x*y)`, this `∂f(x,y)/∂f` would a
 depends on `θ`.

 ```@raw html
-<h3><img src="https://github.com/FluxML/Optimisers.jl/blob/master/docs/src/assets/logo.png?raw=true" width="40px"/><a href="https://github.com/FluxML/Zygote.jl">Zygote.jl</a></h3>
+<h3><img src="../../../assets/zygote-crop.png" width="40px"/> <a href="https://github.com/FluxML/Zygote.jl">Zygote.jl</a></h3>
 ```

 Flux's [`gradient`](@ref) function by default calls a companion package called [Zygote](https://github.com/FluxML/Zygote.jl).
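To make the Zygote hand-off concrete, here is a minimal sketch (the function below is illustrative, not taken from the page under review) of Zygote's `gradient` differentiating an ordinary Julia function:

```julia
using Zygote

# d/dx (3x^2 + 2x) = 6x + 2, so at x = 1 the gradient is 8.
g = gradient(x -> 3x^2 + 2x, 1.0)
# g == (8.0,) -- one entry per argument of the function
```

Flux's own `gradient` wraps this, returning gradients with respect to model parameters as well.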
@@ -327,7 +327,9 @@ grad = Flux.gradient(|>, [1f0], model1)[2]
 This gradient is becoming a complicated nested structure.
 But it works just like before: `grad.outer.inner.W` corresponds to `model1.outer.inner.W`.

-### <img src="https://github.com/FluxML/Optimisers.jl/blob/master/docs/src/assets/logo.png?raw=true" width="40px"/>&nbsp; [Flux's layers](man-layers)
+```@raw html
+<h3><img src="https://github.com/FluxML/Optimisers.jl/blob/master/docs/src/assets/logo.png?raw=true" width="40px"/> <a href="../../../reference/models/layers/">Flux's layers</a></h3>
+```

 Rather than define everything from scratch every time, Flux provides a library of
 commonly used layers. The same model could be defined:
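The definition itself falls outside this diff's context lines. As a sketch of the kind of built-in-layer version meant here (the layer widths are illustrative assumptions, not taken from the source), it might look like:

```julia
using Flux

# Two Dense layers chained together: the same two-layer shape as the
# hand-written model, but built from Flux's library layers.
# The 1 => 3 width is a hypothetical choice for illustration.
model2 = Chain(Dense(1 => 3, tanh), Dense(3 => 1))

y = model2([0.5f0])   # a 1-element output vector
```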
@@ -359,14 +361,14 @@ How does this `model2` differ from the `model1` we had before?
 Calling [`Flux.@layer Layer`](@ref Flux.@layer) will add this, and some other niceties.

 If what you need isn't covered by Flux's built-in layers, it's easy to write your own.
-There are more details [later](man-advanced), but the steps are invariably those shown for `struct Layer` above:
+There are more details [later](@ref man-advanced), but the steps are invariably those shown for `struct Layer` above:
 1. Define a `struct` which will hold the parameters.
 2. Make it callable, to define how it uses them to transform the input `x`.
 3. Define a constructor which initialises the parameters (if the default constructor doesn't do what you want).
 4. Annotate with `@layer` to opt in to pretty printing and other enhancements.

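The four steps above can be sketched as follows (the `Affine` name, field types, and initialisation are hypothetical, chosen only to illustrate the pattern):

```julia
using Flux

struct Affine          # step 1: a struct holding the parameters
    W
    b
end

(m::Affine)(x) = m.W * x .+ m.b         # step 2: make it callable on input x

Affine(in::Int, out::Int) =             # step 3: a constructor that initialises parameters
    Affine(randn(Float32, out, in), zeros(Float32, out))

Flux.@layer Affine                      # step 4: opt in to pretty printing etc.

layer = Affine(2, 3)
size(layer([1f0, 2f0]))   # (3,)
```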
 ```@raw html
-<h3><img src="https://github.com/FluxML/Optimisers.jl/blob/master/docs/src/assets/logo.png?raw=true" width="40px"/><a href="https://github.com/FluxML/Functors.jl">Functors.jl</a></h3>
+<h3><img src="https://github.com/FluxML/Optimisers.jl/blob/master/docs/src/assets/logo.png?raw=true" width="40px"/> <a href="https://github.com/FluxML/Functors.jl">Functors.jl</a></h3>
 ```

 To deal with such nested structures, Flux relies heavily on an associated package
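That associated package is Functors.jl, whose central tool is `fmap`: it walks a nested structure and applies a function at each leaf. A small sketch (the nested tuple below is a made-up example):

```julia
using Functors

nt = (a = [1.0, 2.0], b = (c = [3.0],))

# fmap recurses through the NamedTuples and applies the function to the array leaves
doubled = fmap(x -> 2x, nt)
# doubled == (a = [2.0, 4.0], b = (c = [6.0],))
```

Flux uses this same mechanism to walk models when collecting parameters and applying gradient updates.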
@@ -399,7 +401,7 @@ of the output -- it must be a number, not a vector. Adjusting the parameters
 to make this smaller won't lead us anywhere interesting. Instead, we should minimise
 some *loss function* which compares the actual output to our desired output.

-Perhaps the simplest example is curve fitting. The [previous page](man-overview) fitted
+Perhaps the simplest example is curve fitting. The [previous page](@ref man-overview) fitted
 a linear function to data. With our two-layer `model2`, we can fit a nonlinear function.
 For example, let us use `f(x) = 2x - x^3` evaluated at some points `x in -2:0.1:2` as the data,
 and adjust the parameters of `model2` from above so that its output is similar.
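One way to write a mean-squared-error loss for this fitting problem is sketched below (the layer widths of the stand-in `model2` are assumptions, not taken from the source):

```julia
using Flux

xs = -2:0.1f0:2
ys = [2x - x^3 for x in xs]    # the target curve f(x) = 2x - x^3

# Hypothetical stand-in for the guide's two-layer model2:
model2 = Chain(Dense(1 => 23, tanh), Dense(23 => 1))

# Mean squared error between model output and target, averaged over the points
loss(m) = sum(abs2, m([x])[1] - y for (x, y) in zip(xs, ys)) / length(xs)

loss(model2)   # a single number, which training will drive down
```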
@@ -424,6 +426,6 @@ plot(x -> 2x-x^3, -2, 2, label="truth")
 scatter!(x -> model2([x]), -2:0.1f0:2, label="fitted")
 ```

-If this general idea is unfamiliar, you may want the [tutorial on linear regression](man-linear-regression).
+If this general idea is unfamiliar, you may want the [tutorial on linear regression](@ref man-linear-regression).

-More detail about what exactly the function `train!` is doing, and how to use rules other than simple [`Descent`](@ref Optimisers.Descent), is what the next page in this guide is about: [training](man-training).
+More detail about what exactly the function `train!` is doing, and how to use rules other than simple [`Descent`](@ref Optimisers.Descent), is what the next page in this guide is about: [training](@ref man-training).
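As a preview of that page, a typical `train!` call with `Descent` is sketched below (the model widths and learning rate are illustrative assumptions; the data is the same `(input, target)` pairing as the curve-fitting example):

```julia
using Flux

# Hypothetical setup mirroring the fitting example above:
model2 = Chain(Dense(1 => 23, tanh), Dense(23 => 1))
data = [([x], [2x - x^3]) for x in -2:0.1f0:2]   # 41 (input, target) pairs

# Attach optimiser state to the model, then run one epoch of gradient descent
opt_state = Flux.setup(Descent(0.01), model2)
Flux.train!((m, x, y) -> Flux.mse(m(x), y), model2, data, opt_state)
```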