Commit 1cb9907

Align the activation function list with the source code. Activation source link: https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/nn/functional/activation.py (#7305)
1 parent: 048b6f8

File tree

2 files changed: +24 −2 lines

docs/api_guides/low_level/layers/activations.rst

+12 −1
@@ -8,18 +8,29 @@

 PaddlePaddle 对大部分的激活函数进行了支持,其中有:

+* :ref:`cn_api_paddle_nn_functional_celu`
 * :ref:`cn_api_paddle_nn_functional_elu`
 * :ref:`cn_api_paddle_exp`
-* :ref:`cn_api_paddle_nn_functional_hardsigmoid`
+* :ref:`cn_api_paddle_nn_functional_glu`
+* :ref:`cn_api_paddle_nn_functional_gumbel_softmax`
 * :ref:`cn_api_paddle_nn_functional_hardshrink`
+* :ref:`cn_api_paddle_nn_functional_hardsigmoid`
+* :ref:`cn_api_paddle_nn_functional_hardswish`
+* :ref:`cn_api_paddle_nn_functional_hardtanh`
 * :ref:`cn_api_paddle_nn_functional_leaky_relu`
 * :ref:`cn_api_paddle_nn_functional_log_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_log_softmax`
 * :ref:`cn_api_paddle_nn_functional_maxout`
+* :ref:`cn_api_paddle_nn_functional_mish`
 * :ref:`cn_api_paddle_pow`
 * :ref:`cn_api_paddle_nn_functional_prelu`
 * :ref:`cn_api_paddle_nn_functional_relu`
 * :ref:`cn_api_paddle_nn_functional_relu6`
+* :ref:`cn_api_paddle_nn_functional_rrelu`
+* :ref:`cn_api_paddle_nn_functional_selu`
 * :ref:`cn_api_paddle_nn_functional_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_silu`
+* :ref:`cn_api_paddle_nn_functional_softmax`
 * :ref:`cn_api_paddle_nn_functional_softplus`
 * :ref:`cn_api_paddle_nn_functional_softshrink`
 * :ref:`cn_api_paddle_nn_functional_softsign`

docs/api_guides/low_level/layers/activations_en.rst

+12 −1
@@ -8,18 +8,29 @@ The activation function incorporates non-linearity properties into the neural ne

 PaddlePaddle supports most of the activation functions, including:

+* :ref:`api_paddle_nn_functional_celu`
 * :ref:`api_paddle_nn_functional_elu`
 * :ref:`api_paddle_exp`
-* :ref:`api_paddle_nn_functional_hardsigmoid`
+* :ref:`api_paddle_nn_functional_glu`
+* :ref:`api_paddle_nn_functional_gumbel_softmax`
 * :ref:`api_paddle_nn_functional_hardshrink`
+* :ref:`api_paddle_nn_functional_hardsigmoid`
+* :ref:`api_paddle_nn_functional_hardswish`
+* :ref:`api_paddle_nn_functional_hardtanh`
 * :ref:`api_paddle_nn_functional_leaky_relu`
 * :ref:`api_paddle_nn_functional_log_sigmoid`
+* :ref:`api_paddle_nn_functional_log_softmax`
 * :ref:`api_paddle_nn_functional_maxout`
+* :ref:`api_paddle_nn_functional_mish`
 * :ref:`api_paddle_pow`
 * :ref:`api_paddle_nn_functional_prelu`
 * :ref:`api_paddle_nn_functional_relu`
 * :ref:`api_paddle_nn_functional_relu6`
+* :ref:`api_paddle_nn_functional_rrelu`
+* :ref:`api_paddle_nn_functional_selu`
 * :ref:`api_paddle_tensor_sigmoid`
+* :ref:`api_paddle_nn_functional_silu`
+* :ref:`api_paddle_nn_functional_softmax`
 * :ref:`api_paddle_nn_functional_softplus`
 * :ref:`api_paddle_nn_functional_softshrink`
 * :ref:`api_paddle_nn_functional_softsign`
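For orientation, here is a minimal NumPy sketch of a few of the activations added to the lists above, using their common textbook definitions. This is illustrative only: PaddlePaddle's implementations in `python/paddle/nn/functional/activation.py` may expose extra parameters (e.g. slope, offset, threshold) whose defaults could differ from the constants assumed here.

```python
import numpy as np

# Textbook definitions of a few activations from the list above.
# Assumption: hardsigmoid uses slope 1/6 and offset 0.5, a common default.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def silu(x):
    # silu(x) = x * sigmoid(x)
    return x * sigmoid(x)

def softplus(x):
    # softplus(x) = log(1 + e^x); log1p keeps precision near zero
    return np.log1p(np.exp(x))

def mish(x):
    # mish(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

def hardsigmoid(x):
    # piecewise-linear approximation of sigmoid: clip(x/6 + 0.5, 0, 1)
    return np.clip(x / 6.0 + 0.5, 0.0, 1.0)

def hardswish(x):
    # hardswish(x) = x * hardsigmoid(x)
    return x * hardsigmoid(x)

x = np.array([-6.0, 0.0, 6.0])
print(silu(0.0))        # 0.0
print(hardsigmoid(x))   # [0.  0.5 1. ]
```

All of these are smooth or piecewise-linear gates on the identity, which is why they share the `x * g(x)` shape; the hard variants trade smoothness for cheaper evaluation.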

0 commit comments