Mathematical foundations of machine learning (2) - Linear Algebra

Linear Algebra

Matrix operations:

  1. Addition: corresponding elements are added, so the two matrices must have the same shape
  2. Scalar multiplication: each element is multiplied by the scalar
  3. Matrix multiplication: the number of columns of the left matrix must match the number of rows of the right matrix
  4. Inverse: only square (non-singular) matrices have an inverse (see the NumPy sketch below)
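As a quick illustration, here is a minimal NumPy sketch of the four operations (my own example, not from the original post):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# 1. Addition: element-wise, shapes must match
print(A + B)

# 2. Scalar multiplication: every element is scaled
print(2 * A)

# 3. Matrix multiplication: (2x2) @ (2x2) -> (2x2)
print(A @ B)

# 4. Inverse: defined only for non-singular square matrices
print(np.linalg.inv(A))   # A @ inv(A) gives the identity
```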

Matrix derivatives

$\frac{\partial (Ax)}{\partial x} = A^T$
Differentiating $Ax$ with respect to $x$ gives the transpose of the coefficient matrix (under the denominator-layout convention).
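To check this identity numerically, here is a minimal sketch I am adding (it assumes the denominator layout, where entry $(j, i)$ of the derivative is $\partial (Ax)_i / \partial x_j$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # f(x) = A x maps R^2 -> R^3
x = rng.standard_normal(2)

# Finite-difference derivative: entry (j, i) = d(Ax)_i / dx_j
eps = 1e-6
D = np.zeros((2, 3))
for j in range(2):
    dx = np.zeros(2)
    dx[j] = eps
    D[j] = (A @ (x + dx) - A @ x) / eps

print(np.allclose(D, A.T))   # True: the derivative is A^T
```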

Jacobian matrix and Hessian matrix

Layout conventions (illustrated by the sketch after the list):

  1. Jacobian matrix (the matrix of first-order partial derivatives):
    rows: one per component function $f_i$
    columns: one per variable $x_j$, so entry $(i, j)$ is $\frac{\partial f_i}{\partial x_j}$
  2. Hessian matrix (the matrix of second-order partial derivatives):
    rows: the first variable of differentiation $x_i$
    columns: the second variable of differentiation $x_j$, so entry $(i, j)$ is $\frac{\partial^2 f}{\partial x_i \, \partial x_j}$
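To make the row/column conventions concrete, here is a minimal SymPy sketch (my own example, not from the original post):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# Vector-valued function f: R^2 -> R^2
f = sp.Matrix([x1**2 * x2, 5 * x1 + sp.sin(x2)])

# Jacobian: row i = function f_i, column j = variable x_j
J = f.jacobian([x1, x2])
print(J)   # Matrix([[2*x1*x2, x1**2], [5, cos(x2)]])

# Hessian of a scalar function g: entry (i, j) = d^2 g / (dx_i dx_j)
g = x1**2 * x2 + x2**3
H = sp.hessian(g, [x1, x2])
print(H)   # Matrix([[2*x2, 2*x1], [2*x1, 6*x2]])
```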

Vectors

Operations: addition, subtraction, scalar multiplication, and the inner product (illustrated in the sketch below)
Projection
Linear dependence
Dimension and basis of a vector space
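A minimal NumPy sketch of these vector ideas (my own example; linear dependence is checked here via matrix rank):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])

# Inner product and the norm derived from it
print(u @ v)                 # 3.0
print(np.linalg.norm(u))     # 5.0

# Projection of u onto v: (u.v / v.v) * v
proj = (u @ v) / (v @ v) * v
print(proj)                  # [3. 0.]

# Linear dependence: rank of the stacked vectors
M = np.vstack([u, v, u + 2 * v])
print(np.linalg.matrix_rank(M))   # 2 -> the three vectors span a 2-D space
```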

Main applications

The theory behind PCA (principal component analysis) and SVD (singular value decomposition); a sketch of the connection follows.
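As a minimal example of my own (not from the original post): PCA can be carried out by taking the SVD of the centered data matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data with one dominant direction of variance
X = rng.standard_normal((100, 3)) @ np.diag([2.0, 1.0, 0.1])

# PCA via SVD: center the data, decompose, project
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

explained_variance = S**2 / (len(X) - 1)
print(explained_variance)          # dominated by the first component

k = 2
X_reduced = Xc @ Vt[:k].T          # project onto the top-k principal axes
print(X_reduced.shape)             # (100, 2)
```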

Source: blog.csdn.net/qq_19672707/article/details/90384744