A survey of activation functions and loss functions

1) Activation Function

  • Background

The basic principle of a deep-learning artificial neural network is this: a signal enters a neuron, passes through a nonlinear activation function, and is transmitted to the neurons of the next layer; those neurons are activated in turn and pass the signal on, and so on, layer by layer, until the output layer is reached. Through the repeated composition of these nonlinear functions, the neural network gains enough capacity to capture complex patterns, which has produced state-of-the-art results in many fields. Clearly, activation functions play a pivotal role in deep learning, and they remain a very active research area. Today, the choice of activation function is driven not by how well it simulates a real neuron, but by how well it facilitates the optimization of the whole deep neural network. Below we briefly discuss the advantages and disadvantages of the common activation functions and their application scenarios.
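The layer-by-layer forward pass described above can be sketched with NumPy. The layer sizes, random weights, and input here are arbitrary illustrations for this sketch, not values from the original post:

```python
import numpy as np

def sigmoid(x):
    """Nonlinear activation applied between layers."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# A hypothetical 2-layer network: 3 inputs -> 4 hidden units -> 1 output.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

x = np.array([0.5, -1.0, 2.0])
h = sigmoid(W1 @ x + b1)   # signal enters hidden neurons, gets activated
y = sigmoid(W2 @ h + b2)   # activated signal is passed on to the output layer
```

Each layer is a linear map followed by a nonlinearity; stacking such layers is what gives the network the capacity to fit patterns no single linear map could.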

  • Sigmoid function
  • The sigmoid function σ(x) = 1 / (1 + e^(-x)) was the most frequently used activation function in the early days of deep learning
  • Pros

        It is a smooth function that is easy to differentiate; its derivative is σ'(x) = σ(x)(1 − σ(x))

  • Cons

        Prone to the vanishing-gradient problem

        Its output is not zero-centered

        The exponentiation is relatively expensive to compute

  • Applicable scenarios: commonly used in the output layer; widely applied in binary classification, logistic regression, and similar neural-network tasks
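The pros and cons listed above can be verified numerically: the derivative has the closed form σ(x)(1 − σ(x)), it peaks at 0.25 and collapses toward zero for large |x| (the vanishing-gradient problem), and the output always lies in (0, 1), so it is never zero-centered. A minimal sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Smooth, cheap closed-form derivative: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# The gradient is maximal at x = 0, where it equals 0.25.
assert abs(sigmoid_grad(0.0) - 0.25) < 1e-12

# For large |x| the gradient nearly vanishes -- repeated multiplication of
# such factors across layers is what causes vanishing gradients.
print(sigmoid_grad(10.0))  # ~4.5e-5

# Outputs lie strictly in (0, 1): useful as a probability in binary
# classification, but never zero-centered.
xs = np.linspace(-5.0, 5.0, 11)
assert np.all((sigmoid(xs) > 0) & (sigmoid(xs) < 1))
```

Because the output can be read as a probability, sigmoid remains a natural choice at the output layer of binary classifiers and logistic regression, even where other activations are preferred in hidden layers.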



Origin www.cnblogs.com/Kobaayyy/p/11854958.html