# Introduction

Logistic regression is one of the most basic classification algorithms. Its hypothesis is $H_{\Theta }(x)=\mathrm{sigmoid}(\Theta ^{T}x)$, where $\mathrm{sigmoid}(z)=\frac{1}{1+e^{-z}}$.
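As a sketch of the hypothesis above (function names are my own, not from the original), the sigmoid and the hypothesis can be written in a few lines of NumPy:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    # H_theta(x) = sigmoid(X @ theta); X has shape (m, n), theta has shape (n,)
    return sigmoid(X @ theta)
```

Note that `sigmoid(0)` is exactly 0.5, which is why 0.5 is the natural decision threshold below.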

Its cost function is $J(\Theta )=-\frac{1}{m}\sum_{i=1}^{m}\left(y^{i}\log(H_{\Theta }(x^{i}))+(1-y^{i})\log(1-H_{\Theta }(x^{i}))\right)$.
For the binary classification problem, $y$ takes the values 0 and 1, and $H_{\Theta }(x)$ is interpreted as the probability that $y=1$. When $H_{\Theta }(x)$ is greater than or equal to 0.5 we predict 1, and when it is less than 0.5 we predict 0.
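A minimal sketch of the cost and the thresholded prediction (the helper names `cost` and `predict` are assumptions for illustration):

```python
import numpy as np

def cost(theta, X, y):
    # Cross-entropy cost J(theta) averaged over the m training examples
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def predict(theta, X):
    # Predict 1 when H_theta(x) >= 0.5, else 0
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    return (h >= 0.5).astype(float)
```

With $\Theta = 0$ every example gets $H_{\Theta }(x)=0.5$, so the cost is $\log 2 \approx 0.693$ regardless of the labels, a handy sanity check.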

Gradient descent update: $\Theta_{j} =\Theta_{j} -\alpha\frac{\partial J(\Theta )}{\partial \Theta_{j} }$, where $\frac{\partial J(\Theta )}{\partial \Theta_{j} }=\frac{1}{m}\sum_{i=1}^{m}(H_{\Theta }(x^{i})-y^{i})x_{j}^{i}$. The derivation process is shown in the figure below.
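In case the figure is unavailable, the key steps of the derivation can be sketched as follows. Writing $H = H_{\Theta }(x^{i})$ and using the sigmoid's derivative $H' = H(1-H)x_{j}^{i}$:

$\frac{\partial J(\Theta )}{\partial \Theta_{j} } = -\frac{1}{m}\sum_{i=1}^{m}\left(\frac{y^{i}}{H}-\frac{1-y^{i}}{1-H}\right)H(1-H)x_{j}^{i} = -\frac{1}{m}\sum_{i=1}^{m}\left(y^{i}(1-H)-(1-y^{i})H\right)x_{j}^{i} = \frac{1}{m}\sum_{i=1}^{m}(H_{\Theta }(x^{i})-y^{i})x_{j}^{i}$

The cross terms cancel, leaving the simple residual form used in the update.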

Vectorized expression: $\Theta =\Theta -\frac{\alpha}{m} X^{T}\left(\frac{1}{1+e^{-X\Theta }}-y\right)$, where $X$ stacks the training examples as rows.
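Putting the pieces together, the vectorized update above amounts to a short training loop. This is a minimal sketch (the function name and hyperparameters are my own choices), assuming `X` already includes a bias column of ones:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.5, iters=5000):
    # Vectorized logistic regression:
    # theta <- theta - (alpha / m) * X^T (sigmoid(X theta) - y)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = 1.0 / (1.0 + np.exp(-(X @ theta)))  # predictions for all m examples
        theta -= (alpha / m) * (X.T @ (h - y))  # batch gradient step
    return theta

# Usage on a tiny 1-D problem: first column is the bias term
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = gradient_descent(X, y)
```

Because the data is linearly separable with the boundary between $x=1$ and $x=2$, the learned $\Theta$ places the decision threshold near $x=1.5$.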
