Optimization theory and method study notes

I. Introduction

1. Norms

For x ∈ R^n, the l_p vector norm is ||x||_p = ( sum_i |x_i|^p )^(1/p); the most commonly used cases are

||x||_1 = |x_1| + ... + |x_n|,   ||x||_2 = ( x_1^2 + ... + x_n^2 )^(1/2),   ||x||_∞ = max_i |x_i|.

Frobenius norm:

||A||_F = ( sum_{i,j} a_{ij}^2 )^(1/2) = ( trace(A^T A) )^(1/2).

Weighted Frobenius norm and weighted l_2 norm (where M is a symmetric positive definite n×n matrix):

||A||_{M,F} = || M^(1/2) A M^(1/2) ||_F,   ||A||_{M,2} = || M^(1/2) A M^(1/2) ||_2.

Ellipsoid vector norm:

||x||_M = ( x^T M x )^(1/2).

In particular, when M = I the weighted norms reduce to the ordinary Frobenius and l_2 norms, and ||x||_I = ||x||_2.

Several important inequalities involving norms are:

Cauchy-Schwarz inequality: |x^T y| ≤ ||x||_2 ||y||_2;

triangle inequality: ||x + y|| ≤ ||x|| + ||y||;

consistency: ||A x|| ≤ ||A|| ||x|| and ||A B|| ≤ ||A|| ||B||;

equivalence of the vector norms: ||x||_∞ ≤ ||x||_2 ≤ ||x||_1 ≤ sqrt(n) ||x||_2 ≤ n ||x||_∞.
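As a quick numerical illustration of these definitions and inequalities, here is a small sketch (assuming NumPy is available; the random test data and the tolerance are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x, y = rng.standard_normal(n), rng.standard_normal(n)
A = rng.standard_normal((n, n))

# Vector norms: l_1, l_2, l_inf
l1, l2, linf = np.sum(np.abs(x)), np.sqrt(np.sum(x**2)), np.max(np.abs(x))

# Frobenius norm of A
fro = np.sqrt(np.sum(A**2))

# Ellipsoid norm ||x||_M = (x^T M x)^(1/2) for a symmetric positive definite M
M = A @ A.T + n * np.eye(n)
ellip = np.sqrt(x @ M @ x)

# Cauchy-Schwarz: |x^T y| <= ||x||_2 ||y||_2
assert abs(x @ y) <= l2 * np.linalg.norm(y) + 1e-12

# Norm equivalence: ||x||_inf <= ||x||_2 <= ||x||_1 <= sqrt(n) ||x||_2 <= n ||x||_inf
assert linf <= l2 <= l1 <= np.sqrt(n) * l2 <= n * linf + 1e-12

print(l1, l2, linf, fro, ellip)
```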

2. Optimality conditions for unconstrained problems

Consider the unconstrained problem min f(x), x ∈ R^n.

First-order necessary condition: if x* is a local minimizer and f is continuously differentiable in a neighborhood of x*, then ∇f(x*) = 0.

Second-order necessary condition: if x* is a local minimizer and f is twice continuously differentiable in a neighborhood of x*, then ∇f(x*) = 0 and the Hessian ∇²f(x*) is positive semidefinite.

Second-order sufficient condition: if ∇f(x*) = 0 and ∇²f(x*) is positive definite, then x* is a strict local minimizer.

If f is convex, every local minimizer is a global minimizer; if f is moreover differentiable, every stationary point is a global minimizer.
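As a concrete illustration, for the quadratic f(x) = (1/2) x^T Q x - b^T x with Q symmetric positive definite, the stationary point x* = Q^(-1) b satisfies the second-order sufficient condition and is therefore a strict (in fact global) minimizer. A minimal numerical check, assuming NumPy and an arbitrarily chosen Q and b, might look like this:

```python
import numpy as np

# f(x) = 0.5 x^T Q x - b^T x with Q symmetric positive definite
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

grad = lambda x: Q @ x - b         # gradient of f
hess = lambda x: Q                 # Hessian of f (constant for a quadratic)

x_star = np.linalg.solve(Q, b)     # stationary point: grad f(x*) = 0

# First-order necessary condition: the gradient vanishes at x*
assert np.allclose(grad(x_star), 0.0)

# Second-order sufficient condition: the Hessian at x* is positive definite
assert np.all(np.linalg.eigvalsh(hess(x_star)) > 0)
```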

3. The general structure of an optimization method

Most unconstrained optimization methods are iterative and share the following structure. Given an initial point x_0, for k = 0, 1, 2, ...:

(i) if a termination criterion holds (for example ||∇f(x_k)|| ≤ ε), stop;
(ii) determine a descent direction d_k;
(iii) determine a step size α_k by a one-dimensional search along d_k, so that f(x_k + α_k d_k) < f(x_k);
(iv) set x_{k+1} = x_k + α_k d_k and return to (i).
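A minimal sketch of this framework in Python (assuming NumPy; direction_fn and line_search_fn are hypothetical callables standing in for steps (ii) and (iii)):

```python
import numpy as np

def optimize(f, grad, x0, direction_fn, line_search_fn, tol=1e-6, max_iter=1000):
    """Generic descent framework: test convergence, pick a descent direction,
    pick a step size by a one-dimensional search, update the iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:           # (i) termination test
            break
        d = direction_fn(x, g)                 # (ii) descent direction
        alpha = line_search_fn(f, grad, x, d)  # (iii) step size
        x = x + alpha * d                      # (iv) update
    return x
```

With direction_fn returning -g and line_search_fn a backtracking Armijo search (see the next section), this skeleton becomes the steepest descent method of Section III.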

II. One-dimensional search

Given the current iterate x_k and a descent direction d_k, a one-dimensional search (line search) chooses the step size α_k by minimizing, exactly or approximately, the one-variable function φ(α) = f(x_k + α d_k) over α > 0. Exact line searches minimize φ exactly; inexact line searches only require sufficient-decrease conditions such as the Armijo or Wolfe conditions.
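A widely used inexact search is backtracking under the Armijo condition f(x_k + α d_k) ≤ f(x_k) + c_1 α ∇f(x_k)^T d_k. The sketch below shows one possible implementation (assuming NumPy; the parameter values alpha0, rho and c1 are illustrative defaults, not values from the book):

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds.
    Assumes d is a descent direction, i.e. grad(x)^T d < 0."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ d          # directional derivative of f at x along d
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho             # backtrack
    return alpha
```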

III. Newton's method

1. Steepest descent method (gradient descent method, or simply the gradient method) - P118

The search direction is the negative gradient, d_k = -g_k = -∇f(x_k), and the iteration is

x_{k+1} = x_k - α_k g_k,

where the step size α_k is determined by a one-dimensional search.

Convergence: linear convergence. For a strictly convex quadratic with exact line search the error contracts by a factor of at most ((λ_max - λ_min)/(λ_max + λ_min))², so the method is slow when the Hessian is ill-conditioned.

 
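A minimal, self-contained sketch of the method (assuming NumPy; the backtracking Armijo step size and its parameter values are illustrative choices, while the linear-rate statement above assumes an exact line search):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """Steepest descent: d_k = -g_k with a backtracking Armijo step size."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        d = -g                                     # steepest descent direction
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx - 1e-4 * alpha * (g @ g):
            alpha *= 0.5                           # backtrack until Armijo holds
        x = x + alpha * d
    return x

# Example: an ill-conditioned quadratic, where the linear convergence is slow.
Q = np.diag([1.0, 100.0])
print(steepest_descent(lambda x: 0.5 * x @ Q @ x, lambda x: Q @ x, np.array([1.0, 1.0])))
```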

2. Two-point step size gradient method (Barzilai-Borwein method) - P127

The iteration is x_{k+1} = x_k - α_k g_k, with the step size

α_k = (s_{k-1}^T y_{k-1}) / (y_{k-1}^T y_{k-1})

or

α_k = (s_{k-1}^T s_{k-1}) / (s_{k-1}^T y_{k-1}),

where s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}.

Convergence: R-superlinear convergence (established for two-dimensional strictly convex quadratics).

 
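A minimal sketch of the Barzilai-Borwein iteration using the first step-size formula (assuming NumPy; the plain gradient step used to start the iteration and the safeguard when s^T y ≤ 0 are illustrative choices, not from the book):

```python
import numpy as np

def barzilai_borwein(grad, x0, alpha0=1.0, tol=1e-6, max_iter=10000):
    """Two-point step size gradient method: x_{k+1} = x_k - alpha_k g_k with
    alpha_k = s^T y / y^T y, s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                                 # first step: plain gradient step
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s @ y
        alpha = sty / (y @ y) if sty > 0 else alpha0   # BB1 step size, with a safeguard
        x, g = x_new, g_new
    return x

# Example: a strictly convex quadratic.
Q = np.diag([1.0, 100.0])
print(barzilai_borwein(lambda x: Q @ x, np.array([1.0, 1.0])))
```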


Source: www.cnblogs.com/lucifer1997/p/11525084.html