Machine Learning 02 - Basic Mathematics Related to Machine Learning

1) Paste your study notes from the video. The notes must be your own work, not plagiarized; handwritten notes photographed and pasted in are acceptable.

[Handwritten note photos omitted; the visible page heading reads "p3 3. Matrix and linear algebra"]

2) Summarize "gradient", "gradient descent", and "Bayes' theorem" in your own words. A Word document, a mind map, or photographed handwriting are all acceptable formats; keep the summary concise and the layout neat.

Gradient: the gradient generalizes the idea of slope to functions of several variables. It is a vector whose components are all of the partial derivatives of the function, and it points in the direction in which the function increases fastest.
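For reference, the standard definition in LaTeX (a sketch added for clarity; this notation does not appear in the original notes), for a function f of n variables:

    \nabla f(x) = \left( \frac{\partial f}{\partial x_1},\ \frac{\partial f}{\partial x_2},\ \dots,\ \frac{\partial f}{\partial x_n} \right)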

Gradient descent: to minimize a function, repeatedly move in the direction opposite to the gradient, because that is the direction in which the function value decreases fastest (see the sketch below).
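A minimal Python sketch of this update rule (the loss f(w) = (w - 3)^2, the starting point, and the step size are all illustrative assumptions, not taken from the notes):

    def f_grad(w):
        # Derivative of f(w) = (w - 3)^2 with respect to w.
        return 2.0 * (w - 3.0)

    w = 0.0    # arbitrary starting point
    lr = 0.1   # learning rate (step size), chosen for illustration
    for _ in range(100):
        w -= lr * f_grad(w)   # step opposite to the gradient
    print(w)   # approaches the minimizer w = 3

Each iteration moves w a small step against the gradient, so w slides toward the minimum of the quadratic at w = 3.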

Bayes' theorem: this theorem relates the prior probability and the posterior probability. Starting from the definition of conditional probability, P(B|A) = P(AB) / P(A); rewriting the numerator with the conditional probability P(AB) = P(A|B)P(B) and expanding the denominator P(A) with the total probability formula gives

      P(B|A) = P(A|B)P(B) / [P(A|B)P(B) + P(A|¬B)P(¬B)]

A worked numerical example follows the two definitions below.

      Prior probability: the probability we assign to an event before it happens, based only on prior knowledge, essentially a blind guess made in advance.

      Posterior probability: suppose there is a bottle of vinegar and a piece of beef on the table. After eating the beef and finding it sour, you estimate an 80% chance that vinegar was added to it. A probability updated like this, after the evidence has been observed, is a posterior probability.
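A minimal numerical sketch of this update in Python, reusing the beef-and-vinegar story; every probability below is an invented illustrative number, chosen so the posterior comes out to the 80% mentioned above:

    # All numbers below are made up purely for illustration.
    p_vinegar = 0.5                 # prior P(vinegar added), a blind guess
    p_sour_given_vinegar = 0.8      # P(sour | vinegar added)
    p_sour_given_no_vinegar = 0.2   # P(sour | no vinegar)

    # Denominator P(sour) via the total probability formula.
    p_sour = (p_sour_given_vinegar * p_vinegar
              + p_sour_given_no_vinegar * (1.0 - p_vinegar))

    # Bayes' theorem: posterior P(vinegar | sour).
    p_vinegar_given_sour = p_sour_given_vinegar * p_vinegar / p_sour
    print(p_vinegar_given_sour)     # 0.8, matching the "80%" guess above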
