From 70f7968288c8313fe01c75ae8054bf6d5d662141 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Wed, 30 Apr 2025 01:24:22 +0800
Subject: [PATCH 1/8] Fix the activation function docs; sort the APIs
 alphabetically

---
 .../low_level/layers/activations.rst          | 45 ++++++++++----
 .../low_level/layers/activations_en.rst       | 61 +++++++++----------
 2 files changed, 63 insertions(+), 43 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index 5fecf03707d..102694a009b 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -1,28 +1,49 @@
 .. _api_guide_activations:

-####
-激活函数
-####
+###################
+激活函数
+###################

 激活函数将非线性的特性引入到神经网络当中。

-PaddlePaddle Fluid 对大部分的激活函数进行了支持,其中有:
-
-:ref:`cn_api_fluid_layers_relu`, :ref:`cn_api_fluid_layers_tanh`, :ref:`cn_api_fluid_layers_sigmoid`, :ref:`cn_api_fluid_layers_elu`, :ref:`cn_api_fluid_layers_relu6`, :ref:`cn_api_fluid_layers_pow`, :ref:`cn_api_fluid_layers_stanh`, :ref:`cn_api_fluid_layers_hard_sigmoid`, :ref:`cn_api_fluid_layers_swish`, :ref:`cn_api_fluid_layers_prelu`, :ref:`cn_api_fluid_layers_brelu`, :ref:`cn_api_fluid_layers_leaky_relu`, :ref:`cn_api_fluid_layers_soft_relu`, :ref:`cn_api_fluid_layers_thresholded_relu`, :ref:`cn_api_fluid_layers_maxout`, :ref:`cn_api_fluid_layers_logsigmoid`, :ref:`cn_api_fluid_layers_hard_shrink`, :ref:`cn_api_fluid_layers_softsign`, :ref:`cn_api_fluid_layers_softplus`, :ref:`cn_api_fluid_layers_tanh_shrink`, :ref:`cn_api_fluid_layers_softshrink`, :ref:`cn_api_fluid_layers_exp`。
-
-
-**Fluid 提供了两种使用激活函数的方式:**
+PaddlePaddle 对大部分的激活函数进行了支持,其中有:
+
+- :ref:`cn_api_exp`
+- :ref:`cn_api_pow`
+- :ref:`cn_api_stanh`
+- :ref:`cn_api_nn_functional_elu`
+- :ref:`cn_api_nn_functional_hard_sigmoid`
+- :ref:`cn_api_nn_functional_hard_shrink`
+- :ref:`cn_api_nn_functional_leaky_relu`
+- :ref:`cn_api_nn_functional_logsigmoid`
+- :ref:`cn_api_nn_functional_maxout`
+- :ref:`cn_api_nn_functional_prelu`
+- :ref:`cn_api_static_nn_prelu`
+- :ref:`cn_api_nn_functional_relu`
+- :ref:`cn_api_nn_functional_relu6`
+- :ref:`cn_api_nn_functional_sigmoid`
+- :ref:`cn_api_nn_functional_softplus`
+- :ref:`cn_api_nn_functional_softshrink`
+- :ref:`cn_api_nn_functional_softsign`
+- :ref:`cn_api_nn_functional_swish`
+- :ref:`cn_api_nn_functional_thresholded_relu`
+- :ref:`cn_api_nn_functional_tanh`
+- :ref:`cn_api_nn_functional_tanh_shrink`
+
+
+**PaddlePaddle 提供了两种使用激活函数的方式:**

 - 如果一个层的接口提供了 :code:`act` 变量(默认值为 None),我们可以通过该变量指定该层的激活函数类型。该方式支持常见的激活函数: :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`。

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
+    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
+
+- PaddlePaddle 为每个 Activation 提供了接口,我们可以显式的对它们进行调用。
-- Fluid 为每个 Activation 提供了接口,我们可以显式的对它们进行调用。

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = fluid.layers.relu(conv2d)
+    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3)
+    relu1 = nn.functional.relu(conv2d)

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index 53829ae5696..2afcdf212f4 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -6,44 +6,43 @@ Activation Function

 The activation function incorporates non-linearity properties into the neural network.

-PaddlePaddle Fluid supports most of the activation functions, including:
-
-:ref:`api_fluid_layers_relu`,
-:ref:`api_fluid_layers_tanh`,
-:ref:`api_fluid_layers_sigmoid`,
-:ref:`api_fluid_layers_elu`,
-:ref:`api_fluid_layers_relu6`,
-:ref:`api_fluid_layers_pow`,
-:ref:`api_fluid_layers_stanh`,
-:ref:`api_fluid_layers_hard_sigmoid`,
-:ref:`api_fluid_layers_swish`,
-:ref:`api_fluid_layers_prelu`,
-:ref:`api_fluid_layers_brelu`,
-:ref:`api_fluid_layers_leaky_relu`,
-:ref:`api_fluid_layers_soft_relu`,
-:ref:`api_fluid_layers_thresholded_relu`,
-:ref:`api_fluid_layers_maxout`,
-:ref:`api_fluid_layers_logsigmoid`,
-:ref:`api_fluid_layers_hard_shrink`,
-:ref:`api_fluid_layers_softsign`,
-:ref:`api_fluid_layers_softplus`,
-:ref:`api_fluid_layers_tanh_shrink`,
-:ref:`api_fluid_layers_softshrink`,
-:ref:`api_fluid_layers_exp`.
-
-
-**Fluid provides two ways to use the activation function:**
+PaddlePaddle supports most of the activation functions, including:
+
+- :ref:`api_exp`
+- :ref:`api_pow`
+- :ref:`api_stanh`
+- :ref:`api_nn_functional_elu`
+- :ref:`api_nn_functional_hard_sigmoid`
+- :ref:`api_nn_functional_hard_shrink`
+- :ref:`api_nn_functional_leaky_relu`
+- :ref:`api_nn_functional_logsigmoid`
+- :ref:`api_nn_functional_maxout`
+- :ref:`api_nn_functional_prelu`
+- :ref:`api_static_nn_prelu`
+- :ref:`api_nn_functional_relu`
+- :ref:`api_nn_functional_relu6`
+- :ref:`api_nn_functional_sigmoid`
+- :ref:`api_nn_functional_softplus`
+- :ref:`api_nn_functional_softshrink`
+- :ref:`api_nn_functional_softsign`
+- :ref:`api_nn_functional_swish`
+- :ref:`api_nn_functional_thresholded_relu`
+- :ref:`api_nn_functional_tanh`
+- :ref:`api_nn_functional_tanh_shrink`
+
+
+**PaddlePaddle provides two ways to use the activation function:**

 - If a layer interface provides :code:`act` variables (default None), we can specify the type of layer activation function through this parameter. This mode supports common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`.

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
+    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3, act="relu")

-- Fluid provides an interface for each Activation, and we can explicitly call it.
+- PaddlePaddle provides an interface for each Activation, and we can explicitly call it.

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = fluid.layers.relu(conv2d)
+    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3)
+    relu1 = nn.functional.relu(conv2d)

From 33d9f44ab3170aa5dc6fa9d3d919db3bf71471b9 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 11:06:29 +0800
Subject: [PATCH 2/8] Fix the refs and update the conv2d examples per review

---
 .../low_level/layers/activations.rst          | 64 ++++++++-----------
 .../low_level/layers/activations_en.rst       | 63 ++++++++----------
 2 files changed, 56 insertions(+), 71 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index 102694a009b..f4e2fe2f163 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -8,42 +8,34 @@

 PaddlePaddle 对大部分的激活函数进行了支持,其中有:

-- :ref:`cn_api_exp`
-- :ref:`cn_api_pow`
-- :ref:`cn_api_stanh`
-- :ref:`cn_api_nn_functional_elu`
-- :ref:`cn_api_nn_functional_hard_sigmoid`
-- :ref:`cn_api_nn_functional_hard_shrink`
-- :ref:`cn_api_nn_functional_leaky_relu`
-- :ref:`cn_api_nn_functional_logsigmoid`
-- :ref:`cn_api_nn_functional_maxout`
-- :ref:`cn_api_nn_functional_prelu`
-- :ref:`cn_api_static_nn_prelu`
-- :ref:`cn_api_nn_functional_relu`
-- :ref:`cn_api_nn_functional_relu6`
-- :ref:`cn_api_nn_functional_sigmoid`
-- :ref:`cn_api_nn_functional_softplus`
-- :ref:`cn_api_nn_functional_softshrink`
-- :ref:`cn_api_nn_functional_softsign`
-- :ref:`cn_api_nn_functional_swish`
-- :ref:`cn_api_nn_functional_thresholded_relu`
-- :ref:`cn_api_nn_functional_tanh`
-- :ref:`cn_api_nn_functional_tanh_shrink`
-
-
-**PaddlePaddle 提供了两种使用激活函数的方式:**
-
-- 如果一个层的接口提供了 :code:`act` 变量(默认值为 None),我们可以通过该变量指定该层的激活函数类型。该方式支持常见的激活函数: :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`。
+* :ref:`cn_api_exp`
+* :ref:`cn_api_pow`
+* :ref:`cn_api_stanh`
+* :ref:`cn_api_nn_functional_elu`
+* :ref:`cn_api_nn_functional_hard_sigmoid`
+* :ref:`cn_api_nn_functional_hard_shrink`
+* :ref:`cn_api_nn_functional_leaky_relu`
+* :ref:`cn_api_nn_functional_logsigmoid`
+* :ref:`cn_api_nn_functional_maxout`
+* :ref:`cn_api_nn_functional_prelu`
+* :ref:`cn_api_static_nn_prelu`
+* :ref:`cn_api_nn_functional_relu`
+* :ref:`cn_api_nn_functional_relu6`
+* :ref:`cn_api_nn_functional_sigmoid`
+* :ref:`cn_api_nn_functional_softplus`
+* :ref:`cn_api_nn_functional_softshrink`
+* :ref:`cn_api_nn_functional_softsign`
+* :ref:`cn_api_nn_functional_swish`
+* :ref:`cn_api_nn_functional_thresholded_relu`
+* :ref:`cn_api_nn_functional_tanh`
+* :ref:`cn_api_nn_functional_tanh_shrink`
+
+
+**PaddlePaddle 应用激活函数的方式如下:**
+
+- PaddlePaddle 为每个 Activation 提供了接口,我们可以显式的对它们进行调用,以下是在卷积操作后应用 ReLU 激活函数的示例:

 .. code-block:: python

-    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
-
-
-- PaddlePaddle 为每个 Activation 提供了接口,我们可以显式的对它们进行调用。
-
-
-.. code-block:: python
-
-    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = nn.functional.relu(conv2d)
+    conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1)  # 卷积
+    relu1 = paddle.nn.functional.relu(conv2d)  # 使用 ReLU 激活函数

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index 2afcdf212f4..408bcf61e64 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -8,41 +8,34 @@ The activation function incorporates non-linearity properties into the neural ne

 PaddlePaddle supports most of the activation functions, including:

-- :ref:`api_exp`
-- :ref:`api_pow`
-- :ref:`api_stanh`
-- :ref:`api_nn_functional_elu`
-- :ref:`api_nn_functional_hard_sigmoid`
-- :ref:`api_nn_functional_hard_shrink`
-- :ref:`api_nn_functional_leaky_relu`
-- :ref:`api_nn_functional_logsigmoid`
-- :ref:`api_nn_functional_maxout`
-- :ref:`api_nn_functional_prelu`
-- :ref:`api_static_nn_prelu`
-- :ref:`api_nn_functional_relu`
-- :ref:`api_nn_functional_relu6`
-- :ref:`api_nn_functional_sigmoid`
-- :ref:`api_nn_functional_softplus`
-- :ref:`api_nn_functional_softshrink`
-- :ref:`api_nn_functional_softsign`
-- :ref:`api_nn_functional_swish`
-- :ref:`api_nn_functional_thresholded_relu`
-- :ref:`api_nn_functional_tanh`
-- :ref:`api_nn_functional_tanh_shrink`
-
-
-**PaddlePaddle provides two ways to use the activation function:**
-
-- If a layer interface provides :code:`act` variables (default None), we can specify the type of layer activation function through this parameter. This mode supports common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`.
+* :ref:`api_exp`
+* :ref:`api_pow`
+* :ref:`api_stanh`
+* :ref:`api_nn_functional_elu`
+* :ref:`api_nn_functional_hard_sigmoid`
+* :ref:`api_nn_functional_hard_shrink`
+* :ref:`api_nn_functional_leaky_relu`
+* :ref:`api_nn_functional_logsigmoid`
+* :ref:`api_nn_functional_maxout`
+* :ref:`api_nn_functional_prelu`
+* :ref:`api_static_nn_prelu`
+* :ref:`api_nn_functional_relu`
+* :ref:`api_nn_functional_relu6`
+* :ref:`api_nn_functional_sigmoid`
+* :ref:`api_nn_functional_softplus`
+* :ref:`api_nn_functional_softshrink`
+* :ref:`api_nn_functional_softsign`
+* :ref:`api_nn_functional_swish`
+* :ref:`api_nn_functional_thresholded_relu`
+* :ref:`api_nn_functional_tanh`
+* :ref:`api_nn_functional_tanh_shrink`
+
+
+**The way to apply activation functions in PaddlePaddle is as follows:**
+
+- PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:

 .. code-block:: python

-    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
-
-
-- PaddlePaddle provides an interface for each Activation, and we can explicitly call it.
-
-.. code-block:: python
-
-    conv2d = nn.functional.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = nn.functional.relu(conv2d)
+    conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1)  # Convolution operation
+    relu1 = paddle.nn.functional.relu(conv2d)  # Applying the ReLU activation function

From 5525e4d3894beb5212a15f40dca23c12e93ff4b4 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 12:47:41 +0800
Subject: [PATCH 3/8] Fix the ref rendering issue: it should be
 :ref:`cn_api_paddle_exp`; all refs need the _paddle_ prefix
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 .../low_level/layers/activations.rst          | 42 +++++++++----------
 .../low_level/layers/activations_en.rst       | 42 +++++++++----------
 2 files changed, 42 insertions(+), 42 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index f4e2fe2f163..b8455260f86 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -8,27 +8,27 @@

 PaddlePaddle 对大部分的激活函数进行了支持,其中有:

-* :ref:`cn_api_exp`
-* :ref:`cn_api_pow`
-* :ref:`cn_api_stanh`
-* :ref:`cn_api_nn_functional_elu`
-* :ref:`cn_api_nn_functional_hard_sigmoid`
-* :ref:`cn_api_nn_functional_hard_shrink`
-* :ref:`cn_api_nn_functional_leaky_relu`
-* :ref:`cn_api_nn_functional_logsigmoid`
-* :ref:`cn_api_nn_functional_maxout`
-* :ref:`cn_api_nn_functional_prelu`
-* :ref:`cn_api_static_nn_prelu`
-* :ref:`cn_api_nn_functional_relu`
-* :ref:`cn_api_nn_functional_relu6`
-* :ref:`cn_api_nn_functional_sigmoid`
-* :ref:`cn_api_nn_functional_softplus`
-* :ref:`cn_api_nn_functional_softshrink`
-* :ref:`cn_api_nn_functional_softsign`
-* :ref:`cn_api_nn_functional_swish`
-* :ref:`cn_api_nn_functional_thresholded_relu`
-* :ref:`cn_api_nn_functional_tanh`
-* :ref:`cn_api_nn_functional_tanh_shrink`
+* :ref:`cn_api_paddle_exp`
+* :ref:`cn_api_paddle_pow`
+* :ref:`cn_api_paddle_stanh`
+* :ref:`cn_api_paddle_nn_functional_elu`
+* :ref:`cn_api_paddle_nn_functional_hard_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_hard_shrink`
+* :ref:`cn_api_paddle_nn_functional_leaky_relu`
+* :ref:`cn_api_paddle_nn_functional_logsigmoid`
+* :ref:`cn_api_paddle_nn_functional_maxout`
+* :ref:`cn_api_paddle_nn_functional_prelu`
+* :ref:`cn_api_paddle_static_nn_prelu`
+* :ref:`cn_api_paddle_nn_functional_relu`
+* :ref:`cn_api_paddle_nn_functional_relu6`
+* :ref:`cn_api_paddle_nn_functional_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_softplus`
+* :ref:`cn_api_paddle_nn_functional_softshrink`
+* :ref:`cn_api_paddle_nn_functional_softsign`
+* :ref:`cn_api_paddle_nn_functional_swish`
+* :ref:`cn_api_paddle_nn_functional_thresholded_relu`
+* :ref:`cn_api_paddle_nn_functional_tanh`
+* :ref:`cn_api_paddle_nn_functional_tanh_shrink`


 **PaddlePaddle 应用激活函数的方式如下:**

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index 408bcf61e64..a7046c0ee0c 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -8,27 +8,27 @@ The activation function incorporates non-linearity properties into the neural ne

 PaddlePaddle supports most of the activation functions, including:

-* :ref:`api_exp`
-* :ref:`api_pow`
-* :ref:`api_stanh`
-* :ref:`api_nn_functional_elu`
-* :ref:`api_nn_functional_hard_sigmoid`
-* :ref:`api_nn_functional_hard_shrink`
-* :ref:`api_nn_functional_leaky_relu`
-* :ref:`api_nn_functional_logsigmoid`
-* :ref:`api_nn_functional_maxout`
-* :ref:`api_nn_functional_prelu`
-* :ref:`api_static_nn_prelu`
-* :ref:`api_nn_functional_relu`
-* :ref:`api_nn_functional_relu6`
-* :ref:`api_nn_functional_sigmoid`
-* :ref:`api_nn_functional_softplus`
-* :ref:`api_nn_functional_softshrink`
-* :ref:`api_nn_functional_softsign`
-* :ref:`api_nn_functional_swish`
-* :ref:`api_nn_functional_thresholded_relu`
-* :ref:`api_nn_functional_tanh`
-* :ref:`api_nn_functional_tanh_shrink`
+* :ref:`api_paddle_exp`
+* :ref:`api_paddle_pow`
+* :ref:`api_paddle_stanh`
+* :ref:`api_paddle_nn_functional_elu`
+* :ref:`api_paddle_nn_functional_hard_sigmoid`
+* :ref:`api_paddle_nn_functional_hard_shrink`
+* :ref:`api_paddle_nn_functional_leaky_relu`
+* :ref:`api_paddle_nn_functional_logsigmoid`
+* :ref:`api_paddle_nn_functional_maxout`
+* :ref:`api_paddle_nn_functional_prelu`
+* :ref:`api_paddle_static_nn_prelu`
+* :ref:`api_paddle_nn_functional_relu`
+* :ref:`api_paddle_nn_functional_relu6`
+* :ref:`api_paddle_nn_functional_sigmoid`
+* :ref:`api_paddle_nn_functional_softplus`
+* :ref:`api_paddle_nn_functional_softshrink`
+* :ref:`api_paddle_nn_functional_softsign`
+* :ref:`api_paddle_nn_functional_swish`
+* :ref:`api_paddle_nn_functional_thresholded_relu`
+* :ref:`api_paddle_nn_functional_tanh`
+* :ref:`api_paddle_nn_functional_tanh_shrink`


 **The way to apply activation functions in PaddlePaddle is as follows:**

From 15f265f57f877aec125dd122bf062516ee76c66c Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 13:43:42 +0800
Subject: [PATCH 4/8] Fix some doc version mismatches; several APIs dropped
 the underscore: hardsigmoid, hardshrink, tanhshrink
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

LogSigmoid is not used as an activation function.
---
 docs/api_guides/low_level/layers/activations.rst    | 9 ++++-----
 docs/api_guides/low_level/layers/activations_en.rst | 9 ++++-----
 2 files changed, 8 insertions(+), 10 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index b8455260f86..66430c7e9e2 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -12,10 +12,9 @@ PaddlePaddle 对大部分的激活函数进行了支持,其中有:
 * :ref:`cn_api_paddle_pow`
 * :ref:`cn_api_paddle_stanh`
 * :ref:`cn_api_paddle_nn_functional_elu`
-* :ref:`cn_api_paddle_nn_functional_hard_sigmoid`
-* :ref:`cn_api_paddle_nn_functional_hard_shrink`
+* :ref:`cn_api_paddle_nn_functional_hardsigmoid`
+* :ref:`cn_api_paddle_nn_functional_hardshrink`
 * :ref:`cn_api_paddle_nn_functional_leaky_relu`
-* :ref:`cn_api_paddle_nn_functional_logsigmoid`
 * :ref:`cn_api_paddle_nn_functional_maxout`
 * :ref:`cn_api_paddle_nn_functional_prelu`
 * :ref:`cn_api_paddle_static_nn_prelu`
@@ -27,8 +26,8 @@ PaddlePaddle 对大部分的激活函数进行了支持,其中有:
 * :ref:`cn_api_paddle_nn_functional_softsign`
 * :ref:`cn_api_paddle_nn_functional_swish`
 * :ref:`cn_api_paddle_nn_functional_thresholded_relu`
-* :ref:`cn_api_paddle_nn_functional_tanh`
-* :ref:`cn_api_paddle_nn_functional_tanh_shrink`
+* :ref:`cn_api_paddle_tanh`
+* :ref:`cn_api_paddle_nn_functional_tanhshrink`

 **PaddlePaddle 应用激活函数的方式如下:**

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index a7046c0ee0c..f568b69f572 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -12,10 +12,9 @@ PaddlePaddle supports most of the activation functions, including:
 * :ref:`api_paddle_pow`
 * :ref:`api_paddle_stanh`
 * :ref:`api_paddle_nn_functional_elu`
-* :ref:`api_paddle_nn_functional_hard_sigmoid`
-* :ref:`api_paddle_nn_functional_hard_shrink`
+* :ref:`api_paddle_nn_functional_hardsigmoid`
+* :ref:`api_paddle_nn_functional_hardshrink`
 * :ref:`api_paddle_nn_functional_leaky_relu`
-* :ref:`api_paddle_nn_functional_logsigmoid`
 * :ref:`api_paddle_nn_functional_maxout`
 * :ref:`api_paddle_nn_functional_prelu`
 * :ref:`api_paddle_static_nn_prelu`
@@ -27,8 +26,8 @@ PaddlePaddle supports most of the activation functions, including:
 * :ref:`api_paddle_nn_functional_softsign`
 * :ref:`api_paddle_nn_functional_swish`
 * :ref:`api_paddle_nn_functional_thresholded_relu`
-* :ref:`api_paddle_nn_functional_tanh`
-* :ref:`api_paddle_nn_functional_tanh_shrink`
+* :ref:`api_paddle_tanh`
+* :ref:`api_paddle_nn_functional_tanhshrink`

 **The way to apply activation functions in PaddlePaddle is as follows:**

From 89835b15945d3c319a945f05a0c0d72d51ff2e38 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 13:49:07 +0800
Subject: [PATCH 5/8] Sort the APIs alphabetically; remove the redundant prelu
 activation layer. There were two prelu entries before: one is the activation
 function, the other the activation layer under nn.static (now removed).
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/api_guides/low_level/layers/activations.rst    | 9 ++++-----
 docs/api_guides/low_level/layers/activations_en.rst | 9 ++++-----
 2 files changed, 8 insertions(+), 10 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index 66430c7e9e2..e988c32db6d 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -8,25 +8,24 @@

 PaddlePaddle 对大部分的激活函数进行了支持,其中有:

-* :ref:`cn_api_paddle_exp`
-* :ref:`cn_api_paddle_pow`
-* :ref:`cn_api_paddle_stanh`
 * :ref:`cn_api_paddle_nn_functional_elu`
+* :ref:`cn_api_paddle_exp`
 * :ref:`cn_api_paddle_nn_functional_hardsigmoid`
 * :ref:`cn_api_paddle_nn_functional_hardshrink`
 * :ref:`cn_api_paddle_nn_functional_leaky_relu`
 * :ref:`cn_api_paddle_nn_functional_maxout`
+* :ref:`cn_api_paddle_pow`
 * :ref:`cn_api_paddle_nn_functional_prelu`
-* :ref:`cn_api_paddle_static_nn_prelu`
 * :ref:`cn_api_paddle_nn_functional_relu`
 * :ref:`cn_api_paddle_nn_functional_relu6`
 * :ref:`cn_api_paddle_nn_functional_sigmoid`
 * :ref:`cn_api_paddle_nn_functional_softplus`
 * :ref:`cn_api_paddle_nn_functional_softshrink`
 * :ref:`cn_api_paddle_nn_functional_softsign`
+* :ref:`cn_api_paddle_stanh`
 * :ref:`cn_api_paddle_nn_functional_swish`
-* :ref:`cn_api_paddle_nn_functional_thresholded_relu`
 * :ref:`cn_api_paddle_tanh`
+* :ref:`cn_api_paddle_nn_functional_thresholded_relu`
 * :ref:`cn_api_paddle_nn_functional_tanhshrink`

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index f568b69f572..fe6f5f5a5b4 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -8,25 +8,24 @@ The activation function incorporates non-linearity properties into the neural ne

 PaddlePaddle supports most of the activation functions, including:

-* :ref:`api_paddle_exp`
-* :ref:`api_paddle_pow`
-* :ref:`api_paddle_stanh`
 * :ref:`api_paddle_nn_functional_elu`
+* :ref:`api_paddle_exp`
 * :ref:`api_paddle_nn_functional_hardsigmoid`
 * :ref:`api_paddle_nn_functional_hardshrink`
 * :ref:`api_paddle_nn_functional_leaky_relu`
 * :ref:`api_paddle_nn_functional_maxout`
+* :ref:`api_paddle_pow`
 * :ref:`api_paddle_nn_functional_prelu`
-* :ref:`api_paddle_static_nn_prelu`
 * :ref:`api_paddle_nn_functional_relu`
 * :ref:`api_paddle_nn_functional_relu6`
 * :ref:`api_paddle_nn_functional_sigmoid`
 * :ref:`api_paddle_nn_functional_softplus`
 * :ref:`api_paddle_nn_functional_softshrink`
 * :ref:`api_paddle_nn_functional_softsign`
+* :ref:`api_paddle_stanh`
 * :ref:`api_paddle_nn_functional_swish`
-* :ref:`api_paddle_nn_functional_thresholded_relu`
 * :ref:`api_paddle_tanh`
+* :ref:`api_paddle_nn_functional_thresholded_relu`
 * :ref:`api_paddle_nn_functional_tanhshrink`

From 453030e102dd9f793e41e867824856eea44a1654 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 14:08:00 +0800
Subject: [PATCH 6/8] Found log_sigmoid; add it back to the list
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/api_guides/low_level/layers/activations.rst    | 1 +
 docs/api_guides/low_level/layers/activations_en.rst | 1 +
 2 files changed, 2 insertions(+)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index e988c32db6d..9deed3258e6 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -13,6 +13,7 @@ PaddlePaddle 对大部分的激活函数进行了支持,其中有:
 * :ref:`cn_api_paddle_nn_functional_hardsigmoid`
 * :ref:`cn_api_paddle_nn_functional_hardshrink`
 * :ref:`cn_api_paddle_nn_functional_leaky_relu`
+* :ref:`cn_api_paddle_nn_functional_log_sigmoid`
 * :ref:`cn_api_paddle_nn_functional_maxout`
 * :ref:`cn_api_paddle_pow`
 * :ref:`cn_api_paddle_nn_functional_prelu`

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index fe6f5f5a5b4..b323f1d7126 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -13,6 +13,7 @@ PaddlePaddle supports most of the activation functions, including:
 * :ref:`api_paddle_nn_functional_hardsigmoid`
 * :ref:`api_paddle_nn_functional_hardshrink`
 * :ref:`api_paddle_nn_functional_leaky_relu`
+* :ref:`api_paddle_nn_functional_log_sigmoid`
 * :ref:`api_paddle_nn_functional_maxout`
 * :ref:`api_paddle_pow`
 * :ref:`api_paddle_nn_functional_prelu`

From c00f281ecf76bc6b6644fc499edcc980f9044c15 Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 14:29:00 +0800
Subject: [PATCH 7/8] Reword the code example section
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/api_guides/low_level/layers/activations.rst    | 2 +-
 docs/api_guides/low_level/layers/activations_en.rst | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/api_guides/low_level/layers/activations.rst b/docs/api_guides/low_level/layers/activations.rst
index 9deed3258e6..b9be894f07c 100644
--- a/docs/api_guides/low_level/layers/activations.rst
+++ b/docs/api_guides/low_level/layers/activations.rst
@@ -32,7 +32,7 @@ PaddlePaddle 对大部分的激活函数进行了支持,其中有:

 **PaddlePaddle 应用激活函数的方式如下:**

-- PaddlePaddle 为每个 Activation 提供了接口,我们可以显式的对它们进行调用,以下是在卷积操作后应用 ReLU 激活函数的示例:
+PaddlePaddle 为每个 Activation 提供了接口,可以显式调用。以下是一个示例,展示如何在卷积操作之后应用 ReLU 激活函数:

 .. code-block:: python

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index b323f1d7126..c8eed1e1641 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -32,7 +32,7 @@ PaddlePaddle supports most of the activation functions, including:

 **The way to apply activation functions in PaddlePaddle is as follows:**

-- PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:
+PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:

 .. code-block:: python

From 1d2d3032e23fa693e17710856cb2567e6c34587d Mon Sep 17 00:00:00 2001
From: nyx-c-language
Date: Thu, 8 May 2025 15:43:49 +0800
Subject: [PATCH 8/8] Fix the sigmoid ref rendering in the English docs: as an
 operator it now lives under paddle.tensor, not under nn.functional
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/api_guides/low_level/layers/activations_en.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/api_guides/low_level/layers/activations_en.rst b/docs/api_guides/low_level/layers/activations_en.rst
index c8eed1e1641..cbda7083b39 100644
--- a/docs/api_guides/low_level/layers/activations_en.rst
+++ b/docs/api_guides/low_level/layers/activations_en.rst
@@ -19,7 +19,7 @@ PaddlePaddle supports most of the activation functions, including:
 * :ref:`api_paddle_nn_functional_prelu`
 * :ref:`api_paddle_nn_functional_relu`
 * :ref:`api_paddle_nn_functional_relu6`
-* :ref:`api_paddle_nn_functional_sigmoid`
+* :ref:`api_paddle_tensor_sigmoid`
 * :ref:`api_paddle_nn_functional_softplus`
 * :ref:`api_paddle_nn_functional_softshrink`
 * :ref:`api_paddle_nn_functional_softsign`
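
For reference, the usage pattern that the final revision documents can be exercised end to end. The following is a minimal sketch, assuming a PaddlePaddle 2.x installation; the input tensor ``x``, the convolution ``weight``, and their shapes are illustrative choices rather than values taken from the patches above:

.. code-block:: python

    import paddle
    import paddle.nn.functional as F

    # Illustrative NCHW input: batch 1, 3 channels, 8x8 spatial size.
    x = paddle.randn([1, 3, 8, 8])
    # Convolution weight laid out as [out_channels, in_channels, kernel_h, kernel_w].
    weight = paddle.randn([2, 3, 3, 3])

    conv2d = F.conv2d(x, weight, stride=1, padding=1)  # convolution
    relu1 = F.relu(conv2d)                             # explicit activation call
    print(relu1.shape)  # [1, 2, 8, 8]

With a 3x3 kernel, stride 1, and padding 1, the spatial size is unchanged, so the explicit ``F.relu`` call simply maps the convolution output elementwise.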