Let me talk a bit about machine learning, from a beginner's perspective.
This is my first time writing something like a personal blog post. The main topic is the sigmoid function, one of the more commonly used functions in machine learning. There are similar functions such as softplus and softmax, which I won't cover here. Let's take a look at the sigmoid function's expression and its graph.
The sigmoid function is defined as

    sigmoid(x) = 1 / (1 + e^(-x))

This function is very useful when working with the Bernoulli distribution, and its graph (a smooth S-shaped curve) makes its behavior clear.
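As a quick sketch, the expression above can be implemented directly (a minimal example using numpy; the function name is my own):

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))    # exactly 0.5 at the origin
print(sigmoid(10.0))   # very close to 1
print(sigmoid(-10.0))  # very close to 0
```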
As x tends to positive or negative infinity, the function smoothly approaches 1 or 0 respectively. Because the sigmoid's output lies in the range (0, 1), it is often used to model probabilities in binary classification; in fact, logistic regression is built on this function. Many tutorials list several advantages:
1. Its output range is between 0 and 1.
2. The function has very good symmetry: sigmoid(-x) = 1 - sigmoid(x).
3. Outside a certain range, the function becomes insensitive to its input (it saturates).
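The three properties above can be checked numerically (a sketch using numpy; the variable names are mine):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-8.0, 8.0, 17)
y = sigmoid(x)

# 1. The output always lies strictly inside (0, 1).
assert np.all((y > 0.0) & (y < 1.0))

# 2. Symmetry about the point (0, 0.5): sigmoid(-x) = 1 - sigmoid(x).
assert np.allclose(sigmoid(-x), 1.0 - y)

# 3. Saturation: the derivative sigmoid(x) * (1 - sigmoid(x)) peaks
#    at 0.25 when x = 0 and is tiny for large |x|.
grad = y * (1.0 - y)
print(grad.max())          # 0.25, at x = 0
print(grad[0], grad[-1])   # near zero at both ends
```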
Since the sigmoid output lies between 0 and 1, in a binary classification task we can interpret it as the probability of an event: when the output exceeds a certain probability threshold, we assign the sample to the positive class. This is unlike an SVM, whose decision function outputs a margin rather than a probability.
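To illustrate the thresholding idea, here is a sketch of a logistic-regression-style prediction step. The weights, bias, and threshold below are hypothetical values I made up for the example, not trained parameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights and bias of an already-trained model.
w = np.array([0.8, -1.2])
b = 0.1

def predict(features, threshold=0.5):
    # The sigmoid turns the linear score w.x + b into P(y = 1 | x);
    # outputs at or above the threshold are labeled positive (1).
    p = sigmoid(np.dot(w, features) + b)
    return int(p >= threshold), p

label, prob = predict(np.array([2.0, 0.5]))
print(label, prob)  # score = 1.1, so prob is about 0.75 and label is 1
```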
Original https://www.jianshu.com/p/506595ec4b58