Introduction to the Covariance Matrix

Definition of the covariance matrix

In probability theory and statistics, covariance measures the joint variability of two random variables. Variance is the special case in which the two variables are the same: variance describes the spread of a single variable, while covariance describes how two variables vary together. If the two variables tend to change in the same direction, that is, when one is above its expected value the other also tends to be above its own expected value, their covariance is positive. If they tend to change in opposite directions, one above its expected value while the other is below its own, their covariance is negative.
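The sign behavior described above can be checked numerically. The following sketch (my own illustration, not from the original article) computes the population covariance directly from its definition for two variables that move together and for two that move oppositely:

```python
import numpy as np

# Two variables that rise and fall together -> positive covariance.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # y moves with x

# Population covariance: mean of the products of deviations from the means.
cov_pos = np.mean((x - x.mean()) * (y - y.mean()))
print(cov_pos)  # → 4.0 (positive: same trend)

# Reversing y makes the trends opposite -> negative covariance.
y_rev = y[::-1]
cov_neg = np.mean((x - x.mean()) * (y_rev - y_rev.mean()))
print(cov_neg)  # → -4.0 (negative: opposite trends)
```

When one variable is a constant multiple of the other, as here, the covariance reaches its extreme magnitude for the given spreads; weaker linkage gives values closer to zero.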

In statistics and probability theory, each element of the covariance matrix is the covariance between a pair of components of a random vector. The covariance matrix is the natural extension of variance from a scalar random variable to a high-dimensional random vector.
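The equations in the original post were images and did not survive extraction; the standard definition they presumably showed is the following. For a random vector $X = (X_1, \dots, X_n)^{\mathsf T}$ with mean vector $\mu = E[X]$:

```latex
\Sigma_{ij} = \operatorname{Cov}(X_i, X_j)
            = E\big[(X_i - E[X_i])\,(X_j - E[X_j])\big],
\qquad
\Sigma = E\big[(X - \mu)(X - \mu)^{\mathsf T}\big].
```

The matrix $\Sigma$ is symmetric ($\Sigma_{ij} = \Sigma_{ji}$) and positive semi-definite, and its diagonal entries $\Sigma_{ii} = \operatorname{Var}(X_i)$ recover the ordinary variances.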


Example

The covariance matrix appears everywhere in statistics and machine learning. Generally speaking, it can be read in two parts: the variances form the elements on the diagonal, and the covariances form the elements off the diagonal. In other words, it collects the variances and covariances of a set of variables into a single matrix, which captures the linkage between the variables.



Reprinted from: blog.csdn.net/xiaojinger_123/article/details/130749074