Andrew Ng Machine Learning: 46 Classification / 47 Hypothesis Function / 48 Decision Boundary

Logistic regression

I. Classification

0: negative class, meaning "no", e.g., a benign tumor

1: positive class, meaning "yes", e.g., a malignant tumor (cancer)

Applying linear regression to classification is not ideal, so we introduce the logistic regression algorithm, a classification algorithm used when the output y takes the discrete values 0 and 1.
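As a quick illustration of why this is so (toy data and numbers assumed here, not taken from the lecture), fitting ordinary linear regression to 0/1 labels and thresholding at 0.5 is fragile: a single large example can pull the fit so that predictions leave the [0, 1] range and positive examples fall below the threshold.

```python
import numpy as np

# Hypothetical 1-D feature (e.g., tumor size) with binary labels: 0 = benign, 1 = malignant.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 20.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# Ordinary least squares fit: h(x) = theta0 + theta1 * x
A = np.column_stack([np.ones_like(X), X])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)

preds = A @ theta
print(preds.round(3))              # the prediction at x = 20 exceeds 1 ...
print((preds >= 0.5).astype(int))  # ... and the positives at x = 4 and x = 5 fall below 0.5
```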

 

II. Hypothesis function

In the logistic regression model we want 0 ≤ h_θ(x) ≤ 1. We set h_θ(x) = g(θᵀx), and therefore h_θ(x) = 1 / (1 + e^(−θᵀx)).

Here g(z) = 1 / (1 + e^(−z)) is the sigmoid (logistic) function; as its graph shows, the value of g(z) lies between 0 and 1.
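A minimal sketch of the sigmoid and the hypothesis it defines (NumPy is used here for illustration; the example values of θ and x are assumed):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid (logistic) function g(z) = 1 / (1 + e^(-z)); its value always lies in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def h(theta, x):
    """Hypothesis h_theta(x) = g(theta^T x)."""
    return sigmoid(theta @ x)

print(sigmoid(0.0))                    # 0.5
print(sigmoid(-10.0), sigmoid(10.0))   # close to 0 and close to 1

# Assumed example: theta = [-3, 1, 1], x = [1, 2, 2] -> g(1) ~ 0.73
print(h(np.array([-3.0, 1.0, 1.0]), np.array([1.0, 2.0, 2.0])))
```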

 

III. Decision boundary (Decision Boundary)

The hypothesis h_θ(x) outputs the probability that y = 1. If this probability is greater than or equal to 0.5, y = 1 is the more likely outcome and we predict y = 1; if it is less than 0.5, we predict y = 0.

 

That is, when θᵀx ≥ 0 we have h_θ(x) = g(θᵀx) ≥ 0.5 and predict y = 1; when θᵀx < 0, h_θ(x) < 0.5 and we predict y = 0.
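In code, the prediction rule reduces to checking the sign of θᵀx (a sketch with assumed example values; the first feature column is the intercept term x0 = 1):

```python
import numpy as np

def predict(theta, X):
    """Predict y = 1 when theta^T x >= 0 (equivalently h_theta(x) >= 0.5), else y = 0."""
    return (X @ theta >= 0).astype(int)

# Assumed parameters and feature vectors.
theta = np.array([-1.0, 2.0])
X = np.array([[1.0, 0.3],    # theta^T x = -0.4 < 0  -> predict 0
              [1.0, 0.8]])   # theta^T x =  0.6 >= 0 -> predict 1
print(predict(theta, X))     # [0 1]
```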

1. Linear decision boundary
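The figures from the original post are not reproduced here. As an illustration with assumed parameters, take two features x1, x2 and θ = [-3, 1, 1]; then θᵀx = -3 + x1 + x2, and the decision boundary is the straight line x1 + x2 = 3.

```python
import numpy as np

# Assumed parameters: theta = [-3, 1, 1] on features [1, x1, x2].
# Predict y = 1 when theta^T x = -3 + x1 + x2 >= 0, i.e. on or above the line x1 + x2 = 3.
theta = np.array([-3.0, 1.0, 1.0])

def predict(x1, x2):
    return int(theta @ np.array([1.0, x1, x2]) >= 0)

print(predict(1.0, 1.0))   # 1 + 1 < 3  -> 0
print(predict(2.0, 2.0))   # 2 + 2 >= 3 -> 1
```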

 

 

2. Non-linear decision boundary
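By adding polynomial features, the boundary can be a curve rather than a line. With assumed parameters θ = [-1, 0, 0, 1, 1] on features [1, x1, x2, x1², x2²], the boundary is the circle x1² + x2² = 1.

```python
import numpy as np

# Assumed parameters: theta = [-1, 0, 0, 1, 1] on features [1, x1, x2, x1^2, x2^2].
# Predict y = 1 when -1 + x1^2 + x2^2 >= 0, i.e. on or outside the unit circle x1^2 + x2^2 = 1.
theta = np.array([-1.0, 0.0, 0.0, 1.0, 1.0])

def predict(x1, x2):
    features = np.array([1.0, x1, x2, x1 ** 2, x2 ** 2])
    return int(theta @ features >= 0)

print(predict(0.2, 0.2))   # inside the circle  -> 0
print(predict(1.5, 0.0))   # outside the circle -> 1
```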

 
