[PyTorch] Tutorial: torch.nn.PReLU

torch.nn.PReLU

Prototype

CLASS torch.nn.PReLU(num_parameters=1, init=0.25, device=None, dtype=None)

Parameters

  • num_parameters (int) – the number of a to learn. Although this takes an int, only two values are legal: 1, or the number of channels of the input. Default: 1. (A channel-wise sketch follows this list.)
  • init (float) – the initial value of a. Default: 0.25.
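
When num_parameters equals the channel count, each channel learns its own a. A minimal sketch of this channel-wise use (the channel count 3 and the input shape here are arbitrary choices for illustration):

import torch
import torch.nn as nn

# One learnable "a" per channel: num_parameters must match the
# channel dimension (dim 1) of the input.
m = nn.PReLU(num_parameters=3, init=0.1)
x = torch.randn(2, 3, 4, 4)   # (batch, channels, height, width)
y = m(x)

print(m.weight.shape)  # torch.Size([3]) -- one a per channel
print(y.shape)         # torch.Size([2, 3, 4, 4])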

Definition

\text{PReLU}(x) = \max(0, x) + a * \min(0, x)

or

\text{PReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ ax, & \text{otherwise} \end{cases}
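
To check that the module matches the piecewise definition, one can compute it by hand with torch.where and compare (a minimal check using the default a = 0.25):

import torch
import torch.nn as nn

m = nn.PReLU()                    # single learnable a, init = 0.25
x = torch.randn(5)

out_module = m(x)
a = m.weight                      # the learnable coefficient a, shape (1,)
out_manual = torch.where(x >= 0, x, a * x)

print(torch.allclose(out_module, out_manual))  # True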

Plot

[Plot of the PReLU activation function]

Code

import torch
import torch.nn as nn

m = nn.PReLU()
input = torch.randn(4)
output = m(input)

print("input: ", input)   # input:  tensor([ 0.1061, -2.0532,  1.4081, -0.1516])
print("output: ", output) # output:  tensor([ 0.1061, -0.5133,  1.4081, -0.0379], grad_fn=<PreluBackward>)

Reference

PReLU — PyTorch 1.13 documentation


Original article: blog.csdn.net/zhoujinwang/article/details/129351981