torch.nn.PReLU
prototype
CLASS torch.nn.PReLU(num_parameters=1, init=0.25, device=None, dtype=None)
parameters
- num_parameters (int) – the number of a to learn. Although this takes an int as input, only two values are legal: 1, or the number of channels of the input. Default: 1.
- init (float) – the initial value of a. Default: 0.25.
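When num_parameters is set to the number of channels, each channel gets its own learnable slope a. A minimal sketch (the shapes below are illustrative, not from the original page):

```python
import torch
import torch.nn as nn

# One learnable slope per channel: num_parameters must equal
# the channel dimension of the input (dim 1).
m = nn.PReLU(num_parameters=3, init=0.1)
x = torch.randn(2, 3, 4, 4)   # (N, C, H, W) with C = 3
y = m(x)

print(m.weight.shape)  # torch.Size([3]) -- one a per channel, each initialized to 0.1
print(y.shape)         # torch.Size([2, 3, 4, 4]) -- shape is preserved
```

Passing any other value than 1 or the channel count raises a runtime error, since the per-channel weights cannot be broadcast against the input.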
definition
\text{PReLU}(x) = \max(0, x) + a * \min(0, x)
or
\text{PReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ ax, & \text{otherwise} \end{cases}
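The two forms above are equivalent. A quick numerical check of the first form against the module (a minimal sketch; the sample values are illustrative):

```python
import torch
import torch.nn as nn

m = nn.PReLU()                         # single shared a, initialized to 0.25
x = torch.tensor([1.5, -2.0, 0.0, -0.5])
a = m.weight                           # the learnable slope a

# max(0, x) + a * min(0, x), computed manually
expected = torch.maximum(torch.zeros_like(x), x) + a * torch.minimum(torch.zeros_like(x), x)

print(torch.allclose(m(x), expected))  # True
```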
code
import torch
import torch.nn as nn
m = nn.PReLU()
input = torch.randn(4)
output = m(input)
print("input: ", input) # input: tensor([ 0.1061, -2.0532, 1.4081, -0.1516])
print("output: ", output) # output: tensor([ 0.1061, -0.5133, 1.4081, -0.0379], grad_fn=<PreluBackward>)
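Unlike the fixed 0.01 slope of LeakyReLU, a is a learnable parameter: it receives gradients during backpropagation and is updated by the optimizer. A minimal sketch (the loss and learning rate are illustrative):

```python
import torch
import torch.nn as nn

m = nn.PReLU()                                    # a initialized to 0.25
opt = torch.optim.SGD(m.parameters(), lr=0.1)

x = torch.tensor([-1.0, -2.0])                    # negative inputs, so output = a * x
loss = m(x).sum()                                 # loss = a * (-3), so dloss/da = -3
loss.backward()
print(m.weight.grad)                              # tensor([-3.])

opt.step()
print(m.weight)                                   # updated to 0.25 - 0.1 * (-3.0) = 0.55
```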