@@ -8,18 +8,29 @@ The activation function incorporates non-linearity properties into the neural network.
PaddlePaddle supports most of the commonly used activation functions, including the following (a brief usage sketch appears after the list):
+ * :ref:`api_paddle_nn_functional_celu`
* :ref:`api_paddle_nn_functional_elu`
* :ref:`api_paddle_exp`
- * :ref:`api_paddle_nn_functional_hardsigmoid`
+ * :ref:`api_paddle_nn_functional_glu`
+ * :ref:`api_paddle_nn_functional_gumbel_softmax`
* :ref:`api_paddle_nn_functional_hardshrink`
+ * :ref:`api_paddle_nn_functional_hardsigmoid`
+ * :ref:`api_paddle_nn_functional_hardswish`
+ * :ref:`api_paddle_nn_functional_hardtanh`
* :ref:`api_paddle_nn_functional_leaky_relu`
* :ref:`api_paddle_nn_functional_log_sigmoid`
+ * :ref:`api_paddle_nn_functional_log_softmax`
* :ref:`api_paddle_nn_functional_maxout`
+ * :ref:`api_paddle_nn_functional_mish`
* :ref:`api_paddle_pow`
* :ref:`api_paddle_nn_functional_prelu`
* :ref:`api_paddle_nn_functional_relu`
* :ref:`api_paddle_nn_functional_relu6`
+ * :ref:`api_paddle_nn_functional_rrelu`
+ * :ref:`api_paddle_nn_functional_selu`
* :ref:`api_paddle_tensor_sigmoid`
+ * :ref:`api_paddle_nn_functional_silu`
+ * :ref:`api_paddle_nn_functional_softmax`
* :ref:`api_paddle_nn_functional_softplus`
* :ref:`api_paddle_nn_functional_softshrink`
* :ref:`api_paddle_nn_functional_softsign`
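
For anyone reviewing this change, a minimal sketch of how these functional activations are called (assuming a standard `paddle` 2.x install; the input values are illustrative, and `relu`, `hardsigmoid`, and `softmax` are taken from the list above):

```python
import paddle
import paddle.nn.functional as F

# Sample input covering negative, zero, and positive values,
# so the effect of each activation is visible.
x = paddle.to_tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

print(F.relu(x))         # negatives clamped to 0
print(F.hardsigmoid(x))  # piecewise-linear approximation of sigmoid
print(F.softmax(x))      # rescales values into a probability distribution
```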