
[Docathon][Update Doc No.21] update activation #7297


Merged
merged 9 commits into from
May 9, 2025
50 changes: 31 additions & 19 deletions docs/api_guides/low_level/layers/activations.rst
@@ -1,28 +1,40 @@
.. _api_guide_activations:

####
###################
Activation Function
####
###################

The activation function introduces non-linearity into the neural network.

PaddlePaddle Fluid supports most of the activation functions, including:

:ref:`cn_api_fluid_layers_relu`, :ref:`cn_api_fluid_layers_tanh`, :ref:`cn_api_fluid_layers_sigmoid`, :ref:`cn_api_fluid_layers_elu`, :ref:`cn_api_fluid_layers_relu6`, :ref:`cn_api_fluid_layers_pow`, :ref:`cn_api_fluid_layers_stanh`, :ref:`cn_api_fluid_layers_hard_sigmoid`, :ref:`cn_api_fluid_layers_swish`, :ref:`cn_api_fluid_layers_prelu`, :ref:`cn_api_fluid_layers_brelu`, :ref:`cn_api_fluid_layers_leaky_relu`, :ref:`cn_api_fluid_layers_soft_relu`, :ref:`cn_api_fluid_layers_thresholded_relu`, :ref:`cn_api_fluid_layers_maxout`, :ref:`cn_api_fluid_layers_logsigmoid`, :ref:`cn_api_fluid_layers_hard_shrink`, :ref:`cn_api_fluid_layers_softsign`, :ref:`cn_api_fluid_layers_softplus`, :ref:`cn_api_fluid_layers_tanh_shrink`, :ref:`cn_api_fluid_layers_softshrink`, :ref:`cn_api_fluid_layers_exp`。


**Fluid provides two ways to use activation functions:**

- If a layer interface provides an :code:`act` argument (default None), the activation function of that layer can be specified through this argument. This mode supports the common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, and :code:`identity`.

.. code-block:: python

conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")


- Fluid provides an interface for each Activation, which we can call explicitly.
PaddlePaddle supports most of the activation functions, including:

* :ref:`cn_api_paddle_nn_functional_elu`
* :ref:`cn_api_paddle_exp`
* :ref:`cn_api_paddle_nn_functional_hardsigmoid`
* :ref:`cn_api_paddle_nn_functional_hardshrink`
* :ref:`cn_api_paddle_nn_functional_leaky_relu`
* :ref:`cn_api_paddle_nn_functional_log_sigmoid`
* :ref:`cn_api_paddle_nn_functional_maxout`
* :ref:`cn_api_paddle_pow`
* :ref:`cn_api_paddle_nn_functional_prelu`
* :ref:`cn_api_paddle_nn_functional_relu`
* :ref:`cn_api_paddle_nn_functional_relu6`
* :ref:`cn_api_paddle_nn_functional_sigmoid`
* :ref:`cn_api_paddle_nn_functional_softplus`
* :ref:`cn_api_paddle_nn_functional_softshrink`
* :ref:`cn_api_paddle_nn_functional_softsign`
* :ref:`cn_api_paddle_stanh`
* :ref:`cn_api_paddle_nn_functional_swish`
* :ref:`cn_api_paddle_tanh`
* :ref:`cn_api_paddle_nn_functional_thresholded_relu`
* :ref:`cn_api_paddle_nn_functional_tanhshrink`


**The way to apply activation functions in PaddlePaddle is as follows:**

PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:

.. code-block:: python

conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
relu1 = fluid.layers.relu(conv2d)
conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1) # Convolution operation
relu1 = paddle.nn.functional.relu(conv2d) # Applying the ReLU activation function
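
The snippet above leaves :code:`x` and :code:`weight` undefined. A minimal self-contained sketch could look like the following; the batch size, channel counts, and spatial sizes are illustrative assumptions, not part of the original example:

.. code-block:: python

    import paddle
    import paddle.nn.functional as F

    # Assumed NCHW input batch and a 3x3 kernel with 4 input channels and 2 output channels.
    x = paddle.randn([8, 4, 32, 32])
    weight = paddle.randn([2, 4, 3, 3])

    conv2d = F.conv2d(x, weight, stride=1, padding=1)  # convolution
    relu1 = F.relu(conv2d)                             # element-wise ReLU
    print(relu1.shape)                                 # [8, 2, 32, 32]
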
67 changes: 29 additions & 38 deletions docs/api_guides/low_level/layers/activations_en.rst
@@ -6,44 +6,35 @@ Activation Function

The activation function introduces non-linearity into the neural network.

PaddlePaddle Fluid supports most of the activation functions, including:

:ref:`api_fluid_layers_relu`,
:ref:`api_fluid_layers_tanh`,
:ref:`api_fluid_layers_sigmoid`,
:ref:`api_fluid_layers_elu`,
:ref:`api_fluid_layers_relu6`,
:ref:`api_fluid_layers_pow`,
:ref:`api_fluid_layers_stanh`,
:ref:`api_fluid_layers_hard_sigmoid`,
:ref:`api_fluid_layers_swish`,
:ref:`api_fluid_layers_prelu`,
:ref:`api_fluid_layers_brelu`,
:ref:`api_fluid_layers_leaky_relu`,
:ref:`api_fluid_layers_soft_relu`,
:ref:`api_fluid_layers_thresholded_relu`,
:ref:`api_fluid_layers_maxout`,
:ref:`api_fluid_layers_logsigmoid`,
:ref:`api_fluid_layers_hard_shrink`,
:ref:`api_fluid_layers_softsign`,
:ref:`api_fluid_layers_softplus`,
:ref:`api_fluid_layers_tanh_shrink`,
:ref:`api_fluid_layers_softshrink`,
:ref:`api_fluid_layers_exp`.


**Fluid provides two ways to use activation functions:**

- If a layer interface provides an :code:`act` argument (default None), the activation function of that layer can be specified through this argument. This mode supports the common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, and :code:`identity`.
PaddlePaddle supports most of the activation functions, including:

* :ref:`api_paddle_nn_functional_elu`
* :ref:`api_paddle_exp`
* :ref:`api_paddle_nn_functional_hardsigmoid`
* :ref:`api_paddle_nn_functional_hardshrink`
* :ref:`api_paddle_nn_functional_leaky_relu`
* :ref:`api_paddle_nn_functional_log_sigmoid`
* :ref:`api_paddle_nn_functional_maxout`
* :ref:`api_paddle_pow`
* :ref:`api_paddle_nn_functional_prelu`
* :ref:`api_paddle_nn_functional_relu`
* :ref:`api_paddle_nn_functional_relu6`
* :ref:`api_paddle_tensor_sigmoid`
* :ref:`api_paddle_nn_functional_softplus`
* :ref:`api_paddle_nn_functional_softshrink`
* :ref:`api_paddle_nn_functional_softsign`
* :ref:`api_paddle_stanh`
* :ref:`api_paddle_nn_functional_swish`
* :ref:`api_paddle_tanh`
* :ref:`api_paddle_nn_functional_thresholded_relu`
* :ref:`api_paddle_nn_functional_tanhshrink`


**The way to apply activation functions in PaddlePaddle is as follows:**

PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:

.. code-block:: python

conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")


- Fluid provides an interface for each Activation, which we can call explicitly.

.. code-block:: python

conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
relu1 = fluid.layers.relu(conv2d)
conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1) # Convolution operation
relu1 = paddle.nn.functional.relu(conv2d) # Applying the ReLU activation function
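
Beyond the functional form shown above, PaddlePaddle also exposes activations as layer classes (for example :code:`paddle.nn.ReLU`), which can be composed with other layers. A minimal sketch, with assumed channel counts and input shape, might look like:

.. code-block:: python

    import paddle

    # Sketch only: the Conv2D channel counts and the random input shape are assumptions.
    model = paddle.nn.Sequential(
        paddle.nn.Conv2D(in_channels=4, out_channels=2, kernel_size=3, padding=1),
        paddle.nn.ReLU(),
    )
    y = model(paddle.randn([8, 4, 32, 32]))  # y.shape == [8, 2, 32, 32]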