Probability Theory and Mathematical Statistics: Independent vs. Uncorrelated

For ease of understanding, the difference between independence and mutual exclusivity is not discussed here.

 

First, let's look at the definition of covariance:

            Cov(X, Y) = E{[X - E(X)][Y - E(Y)]}

The properties of covariance are:

            Cov(X, Y) = Cov(Y, X)

            Cov(aX + b, cY + d) = ac Cov(X, Y)

            Cov(X1 + X2, Y) = Cov(X1, Y) + Cov(X2, Y)

            Cov(X, Y) = E(XY) - E(X)E(Y)
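
These properties are easy to check numerically. Below is a minimal sketch of my own (not part of the original post), assuming NumPy is available; the variables X, Y, X2 and the constants a, b, c, d are arbitrary choices used only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Arbitrary test variables, chosen only to exercise the formulas.
X = rng.normal(size=n)
Y = 0.5 * X + rng.normal(size=n)      # deliberately correlated with X
X2 = rng.exponential(size=n)

def cov(u, v):
    # Sample analogue of Cov(U, V) = E{[U - E(U)][V - E(V)]}
    return np.mean((u - u.mean()) * (v - v.mean()))

a, b, c, d = 2.0, 3.0, -1.5, 0.7

print(cov(X, Y), cov(Y, X))                              # Cov(X, Y) = Cov(Y, X)
print(cov(a * X + b, c * Y + d), a * c * cov(X, Y))      # Cov(aX+b, cY+d) = ac Cov(X, Y)
print(cov(X + X2, Y), cov(X, Y) + cov(X2, Y))            # Cov(X1+X2, Y) = Cov(X1, Y) + Cov(X2, Y)
print(cov(X, Y), np.mean(X * Y) - X.mean() * Y.mean())   # Cov(X, Y) = E(XY) - E(X)E(Y)
```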

 

If X and Y are independent, then E(XY) = E(X)E(Y); since Cov(X, Y) = E(XY) - E(X)E(Y), independence implies Cov(X, Y) = 0. The converse does not hold: E(XY) = E(X)E(Y), i.e. Cov(X, Y) = 0, only shows that X and Y are uncorrelated; it does not show that they are independent. (Note: "uncorrelated" here means not linearly related.)
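
As a quick sanity check of the first direction (independence implies zero covariance), here is a small simulation sketch of my own, again assuming NumPy; the two distributions are arbitrary, the only point being that X and Y are generated independently.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# X and Y are generated independently (arbitrary example distributions).
X = rng.uniform(-1.0, 1.0, size=n)
Y = rng.exponential(2.0, size=n)

print(np.mean(X * Y))              # E(XY)
print(np.mean(X) * np.mean(Y))     # E(X)E(Y) -- should be close to E(XY)
print(np.cov(X, Y)[0, 1])          # sample Cov(X, Y), close to 0
```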

How should we understand this? Consider an example.

Draw a two-dimensional Cartesian coordinate system, and let (X, Y) be uniformly distributed on the unit circle X² + Y² = 1.

① Then X and Y are not linearly related, i.e. their correlation coefficient is 0.

   Intuitive explanation: Think of a straight line y = ax + b; its intercept may be negative, positive, or 0. X and Y are linearly related only if every point (x, y) satisfies one such linear equation, but the points on the circle clearly cannot all lie on a single line (by symmetry, only a line through the origin would even be a candidate), so X and Y are uncorrelated.

   Mathematical explanation: By the symmetry of the circle, E(X|Y) = E(Y|X) = 0, so E(X) = E[E(X|Y)] = 0 and E(Y) = E[E(Y|X)] = 0. Likewise, E(XY) = E[X·E(Y|X)] = 0, so Cov(X, Y) = E(XY) - E(X)E(Y) = 0.

② But the two variables are not independent, because the value of X constrains the value of Y: once X is known, Y can only be ±√(1 - X²).
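
The circle example can also be checked by simulation. The sketch below (my addition, assuming NumPy) draws points uniformly on the unit circle via a uniform angle: the sample covariance is essentially 0, yet X and Y are not independent, since E(X²Y²) = 1/8 while E(X²)E(Y²) = 1/4, and these would be equal if X and Y were independent.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Uniform distribution on the unit circle: X = cos(theta), Y = sin(theta)
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X, Y = np.cos(theta), np.sin(theta)

# Uncorrelated: Cov(X, Y) = E(XY) - E(X)E(Y) is (numerically) 0
print(np.mean(X * Y) - np.mean(X) * np.mean(Y))

# Not independent: if X and Y were independent, X^2 and Y^2 would be too,
# but E(X^2 Y^2) = 1/8 while E(X^2) E(Y^2) = 1/4
print(np.mean(X**2 * Y**2))            # ~ 0.125
print(np.mean(X**2) * np.mean(Y**2))   # ~ 0.25
```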

