Optimization Theory - (1) Introduction: 2. Preliminary Mathematical Knowledge

1. The norm of vectors and the positive definiteness of matrices

1. Basic concepts

2. Inner product and norm

  • A norm is a function that captures the notion of "length": in linear algebra, functional analysis, and related areas of mathematics, it assigns a strictly positive length (magnitude) to every nonzero vector in a vector space. It is written $\|x\|$; the defining axioms are written out after this list.
  • For norms and inner products, there are two important inequalities:
    • Triangle inequality:
      $\|x+y\| \le \|x\| + \|y\|$
    • Cauchy-Schwarz inequality:
      $|x^{T}y| \le \|x\|\,\|y\|$
  • Definition
  • Theorem
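For concreteness, the Euclidean inner product on $\mathbb{R}^n$, the norm it induces, and the axioms a norm must satisfy can be written out as follows (standard statements, given here because they are used throughout the rest of these notes):

```latex
% A function ||.|| : R^n -> R is a norm if, for all vectors x, y and scalars a:
%   (1) ||x|| >= 0, and ||x|| = 0 if and only if x = 0   (positive definiteness)
%   (2) ||a x|| = |a| ||x||                              (absolute homogeneity)
%   (3) ||x + y|| <= ||x|| + ||y||                       (triangle inequality)
% The Euclidean inner product and the 2-norm it induces:
\[
  \langle x, y\rangle = x^{T}y = \sum_{i=1}^{n} x_i y_i ,
  \qquad
  \|x\|_2 = \sqrt{x^{T}x} = \Big(\sum_{i=1}^{n} x_i^{2}\Big)^{1/2}.
\]
```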

3. Positive definite matrix

  • Definition: Let $A$ be a real symmetric matrix of order $n$. If $x^{T}Ax > 0$ for every $n$-dimensional nonzero vector $x$, then $A$ is called positive definite; if $x^{T}Ax \ge 0$ for every such $x$, it is positive semidefinite; if $x^{T}Ax$ takes both positive and negative values as $x$ varies, $A$ is called an indefinite matrix (a small worked example follows this list).
  • Properties (standard characterizations are recalled below)
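As an illustration of the definition and of the properties referred to above, the following are standard equivalent characterizations of positive definiteness for a real symmetric matrix, together with a small $2\times 2$ worked example:

```latex
% For a real symmetric n x n matrix A, the following are equivalent:
%   (1) A is positive definite: x^T A x > 0 for every x != 0;
%   (2) every eigenvalue of A is strictly positive;
%   (3) every leading principal minor of A is strictly positive;
%   (4) A = L L^T for a lower-triangular L with positive diagonal (Cholesky factorization).
% Worked example: A = [[2, 1], [1, 2]] is positive definite, since for any x != 0
\[
  x^{T}Ax = 2x_1^{2} + 2x_1x_2 + 2x_2^{2}
          = x_1^{2} + x_2^{2} + (x_1 + x_2)^{2} > 0 ,
\]
% and indeed its eigenvalues are 1 and 3, and its leading principal minors are 2 and 3.
```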

2. Gradients of multivariate functions, the Hessian matrix, and Taylor's formula

1. Definition of gradient
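Throughout this part, $f:\mathbb{R}^n \to \mathbb{R}$. For reference, the standard definition of the gradient:

```latex
% Gradient of f : R^n -> R at x, assuming the partial derivatives exist:
\[
  \nabla f(x) =
  \left(
    \frac{\partial f(x)}{\partial x_1},\;
    \frac{\partial f(x)}{\partial x_2},\;
    \dots,\;
    \frac{\partial f(x)}{\partial x_n}
  \right)^{\!T}.
\]
```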
2. Directional derivative
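For reference, the usual definition of the directional derivative, and its expression through the gradient when $f$ is differentiable:

```latex
% Directional derivative of f at x in a unit direction d (||d|| = 1):
\[
  Df(x; d) = \lim_{t \to 0^{+}} \frac{f(x + t d) - f(x)}{t}.
\]
% If f is differentiable at x, the limit equals the inner product with the gradient:
\[
  Df(x; d) = \nabla f(x)^{T} d .
\]
```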
3. Gradient properties
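The facts usually collected under this heading are the following standard results, stated for a differentiable $f$ at a point where $\nabla f(x) \neq 0$:

```latex
% (1) grad f(x) is the direction of steepest ascent of f at x;
% (2) -grad f(x) is the direction of steepest descent;
% (3) grad f(x) is orthogonal to the level set { y : f(y) = f(x) } through x.
% (1)-(2) follow from Df(x; d) = grad f(x)^T d and the Cauchy-Schwarz inequality:
\[
  \max_{\|d\| = 1} Df(x; d) = \|\nabla f(x)\| ,
  \qquad \text{attained at } d = \frac{\nabla f(x)}{\|\nabla f(x)\|}.
\]
```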
4. Hessian matrix
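For reference, the standard definition (the notation $\nabla^{2} f(x)$ is used for the Hessian below):

```latex
% Hessian of f : R^n -> R at x: the n x n matrix of second-order partial derivatives.
\[
  \nabla^{2} f(x) =
  \begin{pmatrix}
    \dfrac{\partial^{2} f(x)}{\partial x_1^{2}} & \cdots & \dfrac{\partial^{2} f(x)}{\partial x_1 \partial x_n}\\[2ex]
    \vdots & \ddots & \vdots\\[1ex]
    \dfrac{\partial^{2} f(x)}{\partial x_n \partial x_1} & \cdots & \dfrac{\partial^{2} f(x)}{\partial x_n^{2}}
  \end{pmatrix}.
\]
% If the second partial derivatives are continuous, the mixed partials coincide
% and the Hessian is a symmetric matrix.
```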
5. Gradient formulas of several commonly used vector-valued functions
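The identities usually listed here include the following standard formulas (for $x, b \in \mathbb{R}^{n}$, $A \in \mathbb{R}^{n \times n}$, and a constant $c$):

```latex
% Common gradient / Hessian identities:
\[
  \nabla(b^{T}x) = b, \qquad
  \nabla(x^{T}Ax) = (A + A^{T})x, \qquad
  \nabla^{2}(x^{T}Ax) = A + A^{T},
\]
\[
  \nabla\!\left(\tfrac{1}{2}\|x\|_2^{2}\right) = x, \qquad
  \nabla c = 0 .
\]
% When A is symmetric these reduce to grad(x^T A x) = 2Ax and Hessian 2A.
```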
6. Differentiability of vector-valued functions
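For reference, the standard definition: a vector-valued map is differentiable at a point when it admits a linear approximation given by its Jacobian matrix.

```latex
% F : R^n -> R^m is differentiable at x if there is a matrix J_F(x) in R^{m x n}
% (the Jacobian of F at x) such that
\[
  \lim_{h \to 0} \frac{\|F(x + h) - F(x) - J_F(x)\,h\|}{\|h\|} = 0 ,
  \qquad
  \big(J_F(x)\big)_{ij} = \frac{\partial F_i(x)}{\partial x_j}.
\]
```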
7. Taylor expansion of a function of n variables at a point
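For reference, the first- and second-order Taylor expansions that are used throughout optimization (assuming $f$ is twice continuously differentiable near $x$):

```latex
% First-order expansion with Peano remainder:
\[
  f(x + p) = f(x) + \nabla f(x)^{T} p + o(\|p\|).
\]
% Second-order expansion with Peano remainder:
\[
  f(x + p) = f(x) + \nabla f(x)^{T} p + \tfrac{1}{2}\, p^{T} \nabla^{2} f(x)\, p + o(\|p\|^{2}).
\]
% Second-order expansion in Lagrange (mean-value) form, for some theta in (0, 1):
\[
  f(x + p) = f(x) + \nabla f(x)^{T} p + \tfrac{1}{2}\, p^{T} \nabla^{2} f(x + \theta p)\, p .
\]
```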

Source: blog.csdn.net/m0_63853448/article/details/127038586