[2014 Stanford Machine Learning tutorial notes] Chapter 3 - matrix multiplication

    This section covers matrix-matrix multiplication. With it, we can later discuss how linear regression can solve for [theta]0 and [theta]1 simultaneously, without using gradient descent.

    First, let's look at an example. Suppose we have the following two matrices, and we want to compute their product.

    In the example above, multiplying a 2 × 3 matrix by a 3 × 2 matrix yields a 2 × 2 matrix.
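As a concrete illustration (the original matrices were shown in a figure, so the numbers below are made-up placeholders), a 2 × 3 matrix times a 3 × 2 matrix can be computed in NumPy like this:

```python
import numpy as np

# Hypothetical example matrices (values invented for illustration)
A = np.array([[1, 3, 2],
              [4, 0, 1]])   # 2 x 3
B = np.array([[1, 3],
              [0, 1],
              [5, 2]])      # 3 x 2

C = A @ B                   # result has shape 2 x 2
print(C)                    # [[11 10]
                            #  [ 9 14]]
```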

    Next, we talk about the rules of matrix multiplication.
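The rule is: to multiply an m × n matrix A by an n × p matrix B, entry (i, j) of the m × p result is the dot product of row i of A with column j of B. A minimal sketch of this definition (function and variable names are my own, not from the course):

```python
def matmul(A, B):
    """Multiply matrix A (m x n) by matrix B (n x p), both as lists of lists."""
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "inner dimensions must match"
    # C[i][j] is the dot product of row i of A with column j of B
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# 2x3 times 3x2 gives a 2x2 result
print(matmul([[1, 3, 2], [4, 0, 1]],
             [[1, 3], [0, 1], [5, 2]]))   # [[11, 10], [9, 14]]
```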

    Next, let's look at an example.

    Suppose we want to predict the prices of four houses, and we have three competing hypotheses. If we want to apply all three hypotheses to these four houses, an efficient way to do it is with matrices and matrix multiplication.

    The first column of the result matrix contains the predictions from the first hypothesis, the second column corresponds to the second hypothesis, and the third column corresponds to the third hypothesis. With a single matrix multiplication, this example produces all 12 predictions. Even better, many good linear algebra libraries implement matrix multiplication efficiently for us.
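The four-houses, three-hypotheses computation can be sketched as follows. The house sizes and hypothesis parameters below are illustrative placeholders, since the original values were in a figure:

```python
import numpy as np

# Four house sizes (placeholder values)
sizes = np.array([2104, 1416, 1534, 852])

# Design matrix: a column of ones (for theta0) plus the sizes -> 4 x 2
X = np.column_stack([np.ones_like(sizes), sizes])

# Each column holds one hypothesis's parameters [theta0, theta1] -> 2 x 3
Theta = np.array([[-40.0, 200.0, -150.0],
                  [  0.25,  0.1,    0.4]])

# 4 x 2 times 2 x 3 -> 4 x 3: column j holds hypothesis j's predictions
predictions = X @ Theta
print(predictions.shape)   # (4, 3): all 12 predictions in one multiplication
```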

Origin www.cnblogs.com/shirleyya/p/12600655.html