Why the Ridge Regression Estimator Is Invertible

Reference: https://math.stackexchange.com/questions/2447060/prove-that-the-regularization-term-in-rls-makes-the-matrix-invertible

$\hat{\theta} = (X^TX + \lambda I)^{-1}X^Ty$
For any vector $v \in \mathbb{R}^d$, we have $v^TX^TXv = (Xv)^T(Xv) \geq 0$, so $X^TX$ is positive semi-definite. Also, since $\lambda > 0$, $\lambda I$ is positive definite.
Therefore, for any $v \neq 0$,

$v^T(X^TX + \lambda I)v = (Xv)^T(Xv) + \lambda v^Tv \geq \lambda v^Tv > 0,$

so $X^TX + \lambda I$ is positive definite and hence invertible.
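The argument above can be checked numerically. The sketch below (using NumPy; the dimensions and $\lambda$ value are illustrative, not from the original post) builds a case where $X^TX$ is singular because there are more features than samples, yet $X^TX + \lambda I$ is positive definite and the ridge estimator can be computed:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 10          # fewer samples than features: X^T X cannot be full rank
lam = 0.1             # illustrative lambda > 0

X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

gram = X.T @ X
# rank(X^T X) <= min(n, d) = 5 < 10, so the Gram matrix alone is singular
rank = np.linalg.matrix_rank(gram)

ridge = gram + lam * np.eye(d)
# Every eigenvalue of X^T X + lambda*I is at least lambda > 0,
# so the regularized matrix is positive definite and invertible
min_eig = np.linalg.eigvalsh(ridge).min()

# Ridge estimator: solve (X^T X + lambda*I) theta = X^T y
theta_hat = np.linalg.solve(ridge, X.T @ y)

print(rank)              # rank-deficient Gram matrix
print(min_eig >= lam - 1e-12)
print(theta_hat.shape)
```

Note that `np.linalg.solve` is preferred over forming the explicit inverse: it is cheaper and numerically more stable, and it succeeds here precisely because the regularized matrix is positive definite.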


Reposted from blog.csdn.net/weixin_32334291/article/details/89197984