Logistic regression

Prediction function: $h_\theta(x) = g(\theta^T x) = \dfrac{1}{1 + e^{-\theta^T x}}$
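A minimal sketch of this prediction function in Python (the names `sigmoid` and `h` are illustrative, not from the original):

```python
import math

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def h(theta, x):
    """Prediction h_theta(x) = g(theta^T x), with theta and x as plain lists."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return sigmoid(z)
```

For example, with `theta = [0, 0]` every input gives `h = 0.5`, i.e. the model is maximally uncertain before training.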

The value of $h_\theta(x)$ is the probability that $y = 1$, and $1 - h_\theta(x)$ is the probability that $y = 0$.

So $y \sim B(1, h_\theta(x))$, a Bernoulli (two-point) distribution.

The distribution of $y$ can be written compactly as $p(y) = (h_\theta(x))^{y}\,(1 - h_\theta(x))^{1-y}$.

Likelihood function: $L(\theta) = \prod_{i=1}^{m} p(y^{(i)})$ (the idea is to maximize the probability of what has already been observed).
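A sketch of the (log-)likelihood computation, assuming hypothetical variable names `X` (list of feature vectors) and `y` (list of 0/1 labels); the log form is used because the raw product underflows for large $m$:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_likelihood(theta, X, y):
    """log L(theta) = sum_i [ y_i*log(h_i) + (1-y_i)*log(1-h_i) ]."""
    total = 0.0
    for x_i, y_i in zip(X, y):
        h_i = sigmoid(sum(t * xj for t, xj in zip(theta, x_i)))
        total += y_i * math.log(h_i) + (1 - y_i) * math.log(1 - h_i)
    return total
```

With `theta = [0]`, every prediction is 0.5, so each example contributes $\log 0.5$ regardless of its label.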

The next step is to take the log of the likelihood and compute its partial derivatives.
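Sketching those two steps explicitly (using the standard superscript notation $x^{(i)}, y^{(i)}$ for the $i$-th training example):

```latex
\ell(\theta) = \log L(\theta)
             = \sum_{i=1}^{m} \Bigl[\, y^{(i)} \log h_\theta(x^{(i)})
               + \bigl(1 - y^{(i)}\bigr) \log\bigl(1 - h_\theta(x^{(i)})\bigr) \Bigr]

\frac{\partial \ell(\theta)}{\partial \theta_j}
  = \sum_{i=1}^{m} \bigl( y^{(i)} - h_\theta(x^{(i)}) \bigr)\, x_j^{(i)}
```

Setting each partial derivative toward zero via gradient ascent gives the usual update rule $\theta_j := \theta_j + \alpha \sum_{i=1}^{m} \bigl( y^{(i)} - h_\theta(x^{(i)}) \bigr) x_j^{(i)}$, where $\alpha$ is the learning rate.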

