"Probability and Statistics" multiple random variables

Prologue

In the previous two parts on discrete and continuous random variables, we only dealt with a single variable. In practice, however, an experiment often involves several random variables. By multiple random variables we mean several random variables produced by the same experiment: all of their values are determined by the outcome of that experiment, so they are interrelated. Here we take discrete random variables as the example, extend the distribution table and the expectation of a single discrete random variable to the case of multiple random variables, and then, on that basis, discuss conditioning, independence, and other important concepts for multiple random variables.

Now suppose the experiment no longer involves just one random variable, but two random variables X and Y. How should we describe the probabilities of the values they take together?

Joint distribution

Building on the distribution table of a single discrete random variable introduced earlier, we write the joint distribution of the two random variables as P_{X,Y}. Let (x, y) be a pair of possible values of the random variables X and Y. The probability mass assigned to (x, y) is then defined as the probability of the event {X = x, Y = y}:

P_{X,Y}(x, y) = P(X = x, Y = y), that is, the probability that {X = x} and {Y = y} both occur. Let's first look at what a concrete joint distribution looks like. Clearly, we can use a two-dimensional table to represent the joint distribution of the random variables X and Y:

Using this table, let's go through all of the key points about the joint distribution:

First, the joint probability of any pair of values of the random variables X and Y can be read directly from the table, for example: P_{X,Y}(x3, y2) = P(X = x3, Y = y2) = 3/20.

Second, for any event consisting of a set of value pairs of X and Y, say A = {(x1, y2), (x3, y2), (x4, y4)}, we can compute its total probability directly from the joint distribution table: P((X, Y) ∈ A) = Σ_{(x,y)∈A} P_{X,Y}(x, y) = 1/20 + 3/20 + 1/20 = 5/20.

Third, and simplest of all: if we add up every joint probability in the two-dimensional table, the result must be 1, which is just the normalization property of probability.
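Since the table in the original post is only shown as a figure, here is a minimal Python sketch with a hypothetical 4x4 joint distribution table: the entries quoted in the text (such as P(x1, y1) = 1/20, P(x3, y2) = 3/20, and P(x4, y4) = 1/20) are kept, and the remaining cells are made-up values chosen only so that everything sums to 1. It illustrates the three points above.

```python
from fractions import Fraction as F

# Hypothetical joint distribution table for (X, Y).
# The entries quoted in the text are kept; the rest are made up so the total is 1.
p_xy = {
    ("x1", "y1"): F(1, 20), ("x1", "y2"): F(1, 20), ("x1", "y3"): F(1, 20), ("x1", "y4"): F(0, 20),
    ("x2", "y1"): F(2, 20), ("x2", "y2"): F(2, 20), ("x2", "y3"): F(1, 20), ("x2", "y4"): F(1, 20),
    ("x3", "y1"): F(1, 20), ("x3", "y2"): F(3, 20), ("x3", "y3"): F(2, 20), ("x3", "y4"): F(1, 20),
    ("x4", "y1"): F(1, 20), ("x4", "y2"): F(1, 20), ("x4", "y3"): F(1, 20), ("x4", "y4"): F(1, 20),
}

# Point 1: read off a single joint probability.
print(p_xy[("x3", "y2")])                     # 3/20

# Point 2: probability of an event given as a set of (x, y) pairs.
A = {("x1", "y2"), ("x3", "y2"), ("x4", "y4")}
print(sum(p_xy[pair] for pair in A))          # 5/20, printed as 1/4

# Point 3: normalization -- all joint probabilities add up to 1.
print(sum(p_xy.values()))                     # 1
```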

Marginal distribution

Now let's look at event sets again. For example, let A be the set of pairs in the first column of the table, i.e. A = {(x1, y1), (x1, y2), (x1, y3), (x1, y4)}. The total probability of this event set is exactly the probability P(X = x1), and we call this probability a marginal probability:

P(X=x1) = 1/20 + 1/20 + 1/20 + 0 = 3/20

Going one step further, if we compute the marginal probability for every value of the random variable X, we obtain the marginal distribution of X:

P_X(x) = P(X = x) = Σ_y P(X = x, Y = y) = Σ_y P_{X,Y}(x, y)

Put simply: to get the marginal probability of X at a particular value, we sum the joint probabilities in the corresponding column of the table; collecting the marginal probabilities for all values X = xi then gives the marginal distribution of the random variable X.

Of course, the marginal distribution of the random variable Y is obtained in the same way, so we will not repeat it here.
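Continuing with the hypothetical p_xy table from the sketch above, the marginal distributions of X and Y can be computed by summing the joint probabilities over the other variable:

```python
# Marginal distribution of X: for each value x, sum the joint probabilities
# over every value of Y, i.e. P_X(x) = sum_y P_{X,Y}(x, y); similarly for Y.
xs = sorted({x for x, _ in p_xy})
ys = sorted({y for _, y in p_xy})

p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(p_xy[(x, y)] for x in xs) for y in ys}

print(p_x["x1"])                             # 3/20, matching the hand computation above
print(sum(p_x.values()), sum(p_y.values()))  # both 1
```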

What does "marginal" mean in "marginal probability" and "marginal distribution"? In words: the marginal distribution of X, and each marginal probability value, involve only X itself and have nothing to do with any other random variable (here, the random variable Y).

By contrast, the word "joint" in "joint distribution" and "joint probability" makes its meaning clear: the values there are determined jointly by all of the random variables, namely X and Y.

Conditional distribution

As we learned earlier, conditioning on an event can provide additional information about other events, and a random variable taking a particular value is itself an event. Likewise, conditioning can provide additional information about the values a random variable takes. So can we introduce a conditional distribution for a random variable? Of course we can.

The condition may be the occurrence of some event, and of course it may also involve the values of other random variables. Let's look at another figure:

From the figure we can see that, when an event A has occurred, the conditional distribution of the random variable X given A is easy to write down. Remember the formula for conditional probability? Just take it and apply it directly.

P(X = x | A) denotes the probability of X = x given that event A has occurred, and it can also be written as P_{X|A}(x). Looks familiar, doesn't it? Still, a few key points are worth mentioning: for the different values x1, x2, x3, ..., xn of the random variable X, the events {X = xi} ∩ A are mutually exclusive, and their union is the whole event A.
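As a final sketch, still using the hypothetical p_xy table from above: the helper conditional_pmf_of_x below (the name is ours, not from the post) computes P(X = x | A) for an event A given as a set of (x, y) pairs, by summing the mass of {X = x} ∩ A and dividing by P(A). Because the events {X = xi} ∩ A partition A, the resulting conditional probabilities sum to 1.

```python
# Conditional distribution of X given an event A (a set of (x, y) pairs):
# P(X = x | A) = P({X = x} ∩ A) / P(A).
def conditional_pmf_of_x(p_xy, A):
    p_A = sum(p_xy[pair] for pair in A)          # total probability of A
    pmf = {}
    for x, y in A:                               # group the mass of A by the value of X
        pmf[x] = pmf.get(x, 0) + p_xy[(x, y)] / p_A
    return pmf

A = {("x1", "y2"), ("x3", "y2"), ("x4", "y4")}
for x, p in sorted(conditional_pmf_of_x(p_xy, A).items()):
    print(x, p)
# x1 1/5
# x3 3/5
# x4 1/5   -- the conditional probabilities sum to 1, as a pmf must
```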


Source: www.cnblogs.com/traditional/p/12588733.html