【Course】Machine Learning: Week 2 - Lecture 1 - Gradient Descent for Multiple Variables

Gradient Descent For Multiple Variables

Problem: Week 2 extends gradient descent from a single variable to multiple variables.


The hypothesis for linear regression with \(n\) features, taking \(x_{0}=1\), is:

\[ h_{\theta}(x)=\theta^{T} x=\theta_{0} x_{0}+\theta_{1} x_{1}+\theta_{2} x_{2}+\cdots+\theta_{n} x_{n} \]
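Concretely, once a constant feature \(x_{0}=1\) is prepended to each example, the hypothesis is just a dot product. A minimal NumPy sketch (an illustration, not from the original lecture; `hypothesis`, `theta`, and `x` are my own names):

```python
import numpy as np

def hypothesis(theta, x):
    # h_theta(x) = theta^T x, where x already contains the bias feature x0 = 1.
    return theta @ x

# Example with n = 2 features plus the constant x0 = 1:
theta = np.array([1.0, 0.5, -2.0])   # [theta0, theta1, theta2]
x = np.array([1.0, 3.0, 4.0])        # [x0 = 1, x1, x2]
print(hypothesis(theta, x))          # 1.0 + 0.5*3.0 - 2.0*4.0 = -5.5
```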

Gradient descent algorithm

\[ \begin{array}{l}{\text { repeat until convergence: }\{} \\ {\theta_{j}:=\theta_{j}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) \cdot x_{j}^{(i)} \quad \text { for } j:=0 \ldots n} \\ {\}}\end{array} \]
That is:
\[ \begin{array}{l}{\text { repeat until convergence: }\{} \\ {\theta_{0}:=\theta_{0}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) \cdot x_{0}^{(i)}} \\ {\theta_{1}:=\theta_{1}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) \cdot x_{1}^{(i)}} \\ {\theta_{2}:=\theta_{2}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) \cdot x_{2}^{(i)}} \\ {\cdots} \\ {\}}\end{array} \]
The parameters \(\theta_{0}, \theta_{1}, \theta_{2}, \ldots\) must all be updated simultaneously on every iteration.
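To make the update rule concrete, here is a minimal batch gradient descent sketch in NumPy (an illustration, not code from the lecture; the names `gradient_descent`, `X`, `y`, `alpha`, and `num_iters` are my own). The whole gradient is computed from the old `theta` before `theta` is reassigned, which is exactly the simultaneous update described above:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    # X: (m, n+1) design matrix whose first column is all ones (x0 = 1)
    # y: (m,) target vector
    # Returns the learned parameter vector theta of shape (n+1,).
    m = X.shape[0]
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        errors = X @ theta - y            # h_theta(x^(i)) - y^(i) for every i
        gradient = (X.T @ errors) / m     # (1/m) * sum_i error_i * x_j^(i), for all j at once
        theta = theta - alpha * gradient  # simultaneous update of every theta_j
    return theta

# Tiny usage example: one feature plus the bias column, data lying on y = 1 + x.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])
print(gradient_descent(X, y, alpha=0.1, num_iters=2000))  # approximately [1.0, 1.0]
```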


Source: www.cnblogs.com/Ireland/p/12337580.html