Commit 831b6d8

Update lotka.volterra.rst
Fixes a typo where the text said the neural network's hidden layers had width 50, but the code actually creates them with width 64.
1 parent a60cd74 commit 831b6d8

1 file changed: 1 addition, 1 deletion
docs/demos/pinn_forward/lotka.volterra.rst

@@ -82,7 +82,7 @@ We have 3000 training residual points inside the domain and 2 points on the boun
     initializer = "Glorot normal"
     net = dde.nn.FNN(layer_size, activation, initializer)
 
-This is a neural network of depth 7 with 6 hidden layers of width 50. We use :math:`\tanh` as the activation function. Since we expect to have periodic behavior in the Lotka-Volterra equation, we add a feature layer with :math:`\sin(kt)`. This forces the prediction to be periodic and therefore more accurate.
+This is a neural network of depth 7 with 6 hidden layers of width 64. We use :math:`\tanh` as the activation function. Since we expect to have periodic behavior in the Lotka-Volterra equation, we add a feature layer with :math:`\sin(kt)`. This forces the prediction to be periodic and therefore more accurate.
 
 
 .. code-block:: python
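The :math:`\sin(kt)` feature layer mentioned in the changed paragraph can be sketched in plain NumPy, independent of DeepXDE. This is a minimal illustration only: the helper name `periodic_features` and the choice of frequencies k = 1..6 are assumptions for the sketch, not taken from this commit.

```python
import numpy as np

# Illustrative sketch (assumed helper, assumed k = 1..6): expand a scalar
# time input t into the periodic features [t, sin(t), sin(2t), ..., sin(6t)]
# before it enters the fully connected network.
def periodic_features(t, k_max=6):
    """Map times t of shape (n, 1) to (n, 1 + k_max) periodic features."""
    t = np.asarray(t, dtype=float).reshape(-1, 1)
    return np.concatenate([t] + [np.sin(k * t) for k in range(1, k_max + 1)], axis=1)

# Each input time becomes 1 + k_max features; all but t itself are periodic.
print(periodic_features([[0.0], [np.pi / 2]]).shape)  # (2, 7)
```

In DeepXDE such a transform is typically attached to the network with its `apply_feature_transform` hook, so the width-64 hidden layers see periodic inputs that match the oscillatory Lotka-Volterra solution.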
