Table of Contents
Task 1: Deep Learning Activation Functions, Hands-on Practice in a Kaggle Kernel Environment (PyTorch Version)
Six activation functions
For each function, think about three questions: what is its core idea, what is its mathematical formula, and what is its output range?
Sigmoid: vanishing gradients
Tanh: vanishing gradients
ReLU: enables deeper networks (e.g., ResNet)
Softplus: a smooth approximation of ReLU
LeakyReLU: addresses the "dying ReLU" problem
ELU
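Since this outline belongs to a hands-on PyTorch notebook, the six activation functions above can be sketched as follows. This is a minimal illustration using the standard PyTorch equivalents; the original kernel may instead define some of them by hand, and the input values here are made up:

```python
import torch
import torch.nn.functional as F

# Sample inputs spanning negative and positive values
x = torch.linspace(-3.0, 3.0, steps=7)

# The six activation functions from the outline, via torch / torch.nn.functional
activations = {
    "sigmoid": torch.sigmoid,    # range (0, 1); saturates, so gradients vanish
    "tanh": torch.tanh,          # range (-1, 1); zero-centered, but also saturates
    "relu": F.relu,              # range [0, inf); no saturation for x > 0
    "softplus": F.softplus,      # smooth approximation of ReLU
    "leaky_relu": F.leaky_relu,  # small negative slope keeps units from "dying"
    "elu": F.elu,                # smooth, allows negative outputs
}

for name, fn in activations.items():
    print(f"{name:>10}: {[round(v, 3) for v in fn(x).tolist()]}")
```

Printing each function over the same inputs makes the ranges and saturation behavior easy to compare side by side.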
Task 2: Deep Learning Loss Functions, Hands-on Practice in a Kaggle Kernel Environment (PyTorch Version)
First determine whether the task is classification or regression
Regression loss functions
L1_loss: absolute error
L2_loss: squared error (MSE_loss)
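The two regression losses above can be sketched with the standard PyTorch modules; the predictions and targets below are made up for illustration:

```python
import torch
import torch.nn as nn

pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.5, 2.0, 2.0])

l1 = nn.L1Loss()(pred, target)   # mean of |pred - target|
l2 = nn.MSELoss()(pred, target)  # mean of (pred - target)^2

print(f"L1 (MAE): {l1.item():.4f}")  # (0.5 + 0 + 1) / 3 = 0.5
print(f"L2 (MSE): {l2.item():.4f}")  # (0.25 + 0 + 1) / 3 ≈ 0.4167
```

Note how the squared loss penalizes the larger error (1.0) more heavily relative to the small one (0.5) than the absolute loss does.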
Classification loss functions
Hinge Loss (PyTorch: MultiMarginLoss)
Cross Entropy Loss
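The two classification losses above can likewise be sketched with their PyTorch modules. The logits and labels are made up for illustration; `MultiMarginLoss` is PyTorch's multi-class hinge loss, and `CrossEntropyLoss` combines a softmax with negative log-likelihood:

```python
import torch
import torch.nn as nn

# Made-up logits for 2 samples over 3 classes, plus their true labels
logits = torch.tensor([[1.0, 0.5, -1.0],
                       [0.1, 0.6, 0.3]])
labels = torch.tensor([0, 1])

hinge = nn.MultiMarginLoss()(logits, labels)  # multi-class hinge loss
ce = nn.CrossEntropyLoss()(logits, labels)    # softmax + negative log-likelihood

print(f"hinge:         {hinge.item():.4f}")
print(f"cross-entropy: {ce.item():.4f}")
```

The hinge loss only penalizes wrong classes whose logit comes within a margin of the true class's logit, while cross-entropy always pushes probability mass toward the true class.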