1 file changed, +2 -2 lines changed

@@ -11,8 +11,8 @@ Description: Training of neural networks for classification and regression tasks
 related plotting functions. Multiple activation functions are supported,
 including tanh, relu, step and ramp. For the use of the step and ramp
 activation functions in detecting anomalies using autoencoders, see
-Hawkins et al. (2002) <doi.org/10.1007/3-540-46145-0_17>. Furthermore,
-several loss functions are supporterd, including robust ones such as Huber
+Hawkins et al. (2002) <doi:10.1007/3-540-46145-0_17>. Furthermore,
+several loss functions are supported, including robust ones such as Huber
 and pseudo-Huber loss, as well as L1 and L2 regularization. The possible
 options for optimization algorithms are RMSprop, Adam and SGD with momentum.
 The package contains a vectorized C++ implementation that facilitates
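
As an aside on the robust loss functions named in the description, the following is a minimal, generic C++ sketch of the Huber and pseudo-Huber losses. It is not the package's own implementation; the function names, signatures, and the default delta of 1.0 are assumptions made purely for illustration.

#include <cmath>
#include <cstdio>
#include <vector>

// Huber loss: quadratic for small residuals, linear once |residual| > delta.
// (Hypothetical helper for illustration, not code from the package.)
double huber_loss(const std::vector<double>& y, const std::vector<double>& yhat,
                  double delta = 1.0) {
    double total = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) {
        double a = std::fabs(y[i] - yhat[i]);
        total += (a <= delta) ? 0.5 * a * a : delta * (a - 0.5 * delta);
    }
    return total / y.size();
}

// Pseudo-Huber loss: smooth approximation of Huber, differentiable everywhere.
double pseudo_huber_loss(const std::vector<double>& y, const std::vector<double>& yhat,
                         double delta = 1.0) {
    double total = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) {
        double a = (y[i] - yhat[i]) / delta;
        total += delta * delta * (std::sqrt(1.0 + a * a) - 1.0);
    }
    return total / y.size();
}

int main() {
    std::vector<double> y    = {1.0, 2.0, 3.0};
    std::vector<double> yhat = {1.1, 1.5, 5.0};
    std::printf("Huber: %.4f  pseudo-Huber: %.4f\n",
                huber_loss(y, yhat), pseudo_huber_loss(y, yhat));
    return 0;
}

Both losses behave like squared error for small residuals but grow only linearly for large ones, which is what makes them robust to outliers.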