Logistic regression
The problem with using linear regression for classification: how do we determine whether a tumor is malignant?
Linear regression is not robust enough here: a little noise (a single outlying sample) shifts the fitted line and its decision threshold, and the classifier immediately "surrenders".
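A minimal sketch of that fragility, using a hypothetical 1-D tumor-size dataset (labels 1 = malignant, 0 = benign): fitting a least-squares line and thresholding it at 0.5 works on clean data, but one far-away malignant sample drags the whole line and moves the decision point.

```python
import numpy as np

# Hypothetical 1-D tumor-size data: label 1 = malignant, 0 = benign.
x = np.array([1.0, 2.0, 3.0, 6.0, 7.0, 8.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

def fit_line(x, y):
    """Least-squares fit y ~ a*x + b, returned as (a, b)."""
    a, b = np.polyfit(x, y, deg=1)
    return a, b

def threshold_at(a, b):
    """The x where the fitted line crosses 0.5 (the 0/1 decision point)."""
    return (0.5 - b) / a

t_clean = threshold_at(*fit_line(x, y))  # symmetric data -> exactly 4.5

# Add one far-away malignant sample (an outlier on the size axis).
x2 = np.append(x, 30.0)
y2 = np.append(y, 1.0)
t_noisy = threshold_at(*fit_line(x2, y2))

print(t_clean, t_noisy)  # the decision point shifts noticeably
```

The extra point carries no new information (it is clearly malignant), yet it still changes where the line crosses 0.5; this is the motivation for squashing the output instead.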
Logistic regression - Classification
The sigmoid function (a squashing function) is a smoothed step function with outputs between 0 and 1; it solves the problem that the step function is not smooth (not differentiable).
ReLU, the rectified linear function, is likewise better behaved than the step function.
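A small sketch contrasting the hard step with its smooth replacement: the step jumps from 0 to 1 at z = 0, while the sigmoid passes through the same midpoint (0.5) but changes gradually.

```python
import math

def step(z):
    """Hard threshold: 1 if z >= 0, else 0 (not differentiable at 0)."""
    return 1.0 if z >= 0 else 0.0

def sigmoid(z):
    """Smooth squashing of any real z into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

for z in (-2.0, 0.0, 2.0):
    print(z, step(z), round(sigmoid(z), 3))
```

Both functions agree on which side of 0.5 they land for z ≠ 0, but only the sigmoid has a usable gradient everywhere, which is what gradient-based training needs.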
We feed the linear regression fit through the squashing function, so its value can be read as a probability; taking 0.5 as the threshold, the squashed output defines a decision boundary, and the samples can be divided into two classes, positive and negative.
The sigmoid function (widely used in neural networks) is g(z) = 1 / (1 + e^(-z)); the term e^(-z), i.e. the sign of z, determines whether g(z) is greater or less than 0.5: when z > 0, g(z) > 0.5; when z < 0, g(z) < 0.5.
The expression z = 0 corresponds exactly to the classification boundary; positive and negative samples fall on different sides of this boundary, giving g(z) > 0.5 on one side and g(z) < 0.5 on the other, so comparing g(z) against 0.5 achieves the classification.
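The boundary rule above can be sketched with a hypothetical, already-learned parameter vector theta (the values here are made up for illustration): z = theta · [1, x1, x2], and testing g(z) > 0.5 is the same as testing z > 0.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical learned parameters; the boundary z = 0 is the line x1 + x2 = 3.
theta = [-3.0, 1.0, 1.0]

def predict(x1, x2):
    """Classify a 2-D point: 1 (positive) if g(z) > 0.5, else 0 (negative)."""
    z = theta[0] + theta[1] * x1 + theta[2] * x2
    return 1 if sigmoid(z) > 0.5 else 0  # equivalent to testing z > 0

print(predict(2.5, 2.5))  # z = 2 > 0  -> class 1
print(predict(0.5, 0.5))  # z = -2 < 0 -> class 0
```

Points on opposite sides of the line x1 + x2 = 3 get opposite labels, exactly the g(z) vs 0.5 comparison described above.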
Logistic regression loss function
The problem with the squared loss: composed with the sigmoid it gives a non-convex objective with many local minima, so logistic regression uses the log (cross-entropy) loss instead.
Here h_θ(x) is the predicted value and y is the true label.
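A minimal sketch of the standard cross-entropy loss for a single sample, in terms of the prediction h = h_θ(x) and the true label y: confident correct predictions are penalized lightly, confident wrong ones heavily.

```python
import math

def log_loss(h, y):
    """Cross-entropy loss for one sample:
    -log(h) if y = 1, -log(1 - h) if y = 0.
    Unlike the squared loss (h - y)**2, this is convex in the model
    parameters when h is the sigmoid of a linear function of x."""
    return -math.log(h) if y == 1 else -math.log(1.0 - h)

# Confident and correct -> small loss; confident and wrong -> large loss.
print(round(log_loss(0.9, 1), 3))   # small
print(round(log_loss(0.1, 1), 3))   # large
```

The asymmetry is the point: as h → 0 with y = 1 the loss grows without bound, which pushes gradient descent hard away from confidently wrong predictions.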
Solve for the parameters with gradient descent.
The goal of machine learning, or at least of supervised learning, is to minimize this objective function (the loss function plus a regularization term) by gradient descent.
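The whole pipeline can be sketched end to end: batch gradient descent on the average cross-entropy loss for a toy 1-D dataset (the learning rate and iteration count are arbitrary choices, and the regularization term is omitted for brevity). For this loss the per-sample gradient takes the simple form (h - y) * x.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy 1-D separable data: small x -> class 0, large x -> class 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

w, b = 0.0, 0.0   # parameters theta, initialized at zero
lr = 0.5          # learning rate (hypothetical choice)
n = len(xs)

for _ in range(2000):
    # Gradient of the average cross-entropy loss: (h - y) * x per sample.
    dw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
    db = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(preds)  # should reproduce the two groups in ys
```

Adding an L2 regularization term would simply add lambda * w to dw, shrinking the weights while the loss term fits the data.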