Probability theory (discrete): a minimalist introduction

Classical Probability Model

Definition
\[ P(A)=\frac{m}{n} \]
m: the number of basic events contained in event A.

n: the total number of basic events.

Basic events: the smallest events in the sample space, which cannot be subdivided further.
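As a quick illustration of the classical model, here is a minimal Python sketch; the die example is my own, not from the original notes.

```python
from fractions import Fraction

def classical_probability(m, n):
    """Classical model: P(A) = m / n for n equally likely basic events."""
    return Fraction(m, n)

# Rolling one fair die: the event "even number" contains 3 of the 6 basic events.
p_even = classical_probability(3, 6)  # 1/2
```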

Symbols

\(\Omega\) denotes the sample space. \(\varnothing\) denotes the impossible event. \(\overline{A}\) denotes the opposite event of \(A\). \(AB\) denotes that events \(A\) and \(B\) occur simultaneously.

Several formulas
\[ P(A)=1-P(\overline{A}) \]
Explanation: the probability of event \(A\) is \(1\) minus the probability of the opposite event of \(A\).
\[ P(A+B)=P(A)+P(B)-P(AB) \]
Explanation: the probability that event \(A\) or event \(B\) occurs equals the probability of \(A\) plus the probability of \(B\), minus the probability that \(A\) and \(B\) occur simultaneously. (Inclusion-exclusion.)
\[ P(A_1+A_2+\cdots+A_n)=\sum_{i=1}^{n}P(A_i)-\sum_{1\leqslant i<j\leqslant n}P(A_iA_j)+\sum_{1\leqslant i<j<k\leqslant n}P(A_iA_jA_k)-\cdots+(-1)^{n-1}P\left(\prod_{i=1}^{n}A_i\right) \]
Explanation: a generalization of the previous formula, again by inclusion-exclusion.

If \(A\) and \(B\) are mutually exclusive ( \(AB=\varnothing\) ), then \(P(A+B)=P(A)+P(B)\).

Explanation: a special case of the formula above. Since \(A\) and \(B\) are mutually exclusive, \(P(AB)\) is \(0\).

If \(A\) and \(B\) are independent, then \(P(AB)=P(A)\times P(B)\).
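The inclusion-exclusion formula can be checked by brute-force enumeration; a minimal sketch, with two dice events of my own choosing.

```python
from itertools import product

# Sample space: all ordered pairs from two fair dice (36 equally likely outcomes).
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if w[0] == 6}          # first die shows 6
B = {w for w in omega if w[0] + w[1] >= 10}  # sum is at least 10

def P(event):
    return len(event) / len(omega)

lhs = P(A | B)                  # P(A + B): A or B occurs
rhs = P(A) + P(B) - P(A & B)    # inclusion-exclusion
```

Both sides come out to \(9/36 = 0.25\) here.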

Conditional Probability

\(P(B|A)\) denotes the probability that \(B\) occurs given that \(A\) has occurred.

Clearly, we have:
\[ \begin{aligned} P(AB)&=P(A)P(B|A)\\&=P(B)P(A|B) \end{aligned} \]
which is equivalent to
\[ P(B|A)=\frac{P(AB)}{P(A)} \]
If \(A\) and \(B\) are independent, then
\[ \begin{aligned} &P(AB)=P(A)P(B)\\ \Leftrightarrow\ &P(A|B)=P(A)\\ \Leftrightarrow\ &P(B|A)=P(B) \end{aligned} \]
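The definition \(P(B|A)=P(AB)/P(A)\) and the product rule can again be verified by counting; a small sketch with events I picked for illustration.

```python
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # two fair dice

A = {w for w in omega if w[0] + w[1] == 8}  # sum equals 8 (5 outcomes)
B = {w for w in omega if w[0] == 5}         # first die shows 5

def P(event):
    return len(event) / len(omega)

# Conditional probability from the definition.
p_b_given_a = P(A & B) / P(A)   # 1/5

# Product rule: P(AB) = P(A) * P(B|A).
check = P(A) * p_b_given_a      # equals P(A & B) = 1/36
```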

Total probability formula

Complete Event Group

A complete event group is a set of events \(A_i\ (i=1,\dots,n)\) satisfying
\[ \bigcup_{i=1}^{n}A_i=\Omega,\qquad \forall i\neq j,\ A_iA_j=\varnothing \]
In short, a complete event group is a partition of the sample space.

Total probability formula

For a complete event group \(A_i\), the total probability formula states:
\[ P(B)=\sum_{i=1}^{n}P(A_i)P(B|A_i) \]
This can be understood intuitively.
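A numeric sketch of the total probability formula; the factory scenario and all numbers are invented for illustration.

```python
# Hypothetical factory: three machines produce 50%, 30%, 20% of all items
# (a complete event group A_1, A_2, A_3), with defect rates 1%, 2%, 3%.
p_A = [0.5, 0.3, 0.2]             # P(A_i)
p_B_given_A = [0.01, 0.02, 0.03]  # P(B | A_i), where B = "item is defective"

# Total probability formula: P(B) = sum_i P(A_i) * P(B | A_i).
p_B = sum(pa * pb for pa, pb in zip(p_A, p_B_given_A))  # 0.017
```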

Inverse probability formula (Bayes' formula)

\[ P(A_i|B)=\frac{P(A_iB)}{P(B)}=\frac{P(A_i)P(B|A_i)}{\sum P(A_i)P(B|A_i)} \]

This one is harder to grasp intuitively, and not often needed.
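Even so, the formula is easy to apply mechanically; a sketch using a hypothetical complete event group with prior probabilities and conditional probabilities of my own choosing.

```python
# Hypothetical complete event group A_1..A_3 with priors P(A_i),
# and conditional probabilities P(B | A_i) for some event B.
p_A = [0.5, 0.3, 0.2]
p_B_given_A = [0.01, 0.02, 0.03]

# Denominator: total probability P(B).
p_B = sum(pa * pb for pa, pb in zip(p_A, p_B_given_A))

# Inverse formula: P(A_i | B) = P(A_i) P(B | A_i) / P(B).
posterior = [pa * pb / p_B for pa, pb in zip(p_A, p_B_given_A)]
```

The posterior probabilities form a distribution over the \(A_i\), so they sum to 1.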

Bernoulli probability model

In \(n\) independent repeated trials, if event \(A\) occurs with probability \(p\), then the probability that \(A\) occurs exactly \(k\) times is
\[ P_n(k)={n\choose k}p^k(1-p)^{n-k}\qquad(k=0,1,\dots,n) \]
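The formula translates directly into code; a minimal sketch (the coin-flip numbers are my own example).

```python
from math import comb

def bernoulli_pmf(n, p, k):
    """Probability that the event occurs exactly k times in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 3 heads in 10 fair coin flips: C(10,3) / 2^10.
p3 = bernoulli_pmf(10, 0.5, 3)

# Sanity check: the probabilities over k = 0..n sum to 1.
total = sum(bernoulli_pmf(10, 0.5, k) for k in range(11))
```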

Random Variables

We assign to each outcome \(\omega\in\Omega\) a value \(X(\omega)\) (abbreviated \(X\)); then \(X\) is a random variable.

Distribution column

\(X\) \(x_1\) \(x_2\) \(\cdots\) \(x_n\)
\(P\) \(p_1\) \(p_2\) \(\cdots\) \(p_n\)

Distribution law

\(P(X=x_i)=p_i\)

Obviously, \(p_i\geqslant 0\) and \(\sum p_i=1\).

Distribution function
\[ F(x)=P(X\leqslant x) \]

Some distributions of discrete variables

Binomial distribution \(X\sim B(n,p)\)
\[ P(X=k)={n\choose k}p^k(1-p)^{n-k}\qquad(k=0,1,\dots,n) \]
Two-point distribution: the special case of the binomial distribution with \(n=1\).

Poisson distribution \(X\sim\pi(\lambda)\)
\[ P(X=k)=\frac{\lambda^k}{k!}e^{-\lambda}\qquad(k=0,1,2,\dots) \]
The Poisson distribution applies to rare ("sparse") events.
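The "rare events" remark can be checked numerically: for large \(n\) and small \(p\) with \(\lambda=np\) fixed, the binomial pmf is close to the Poisson pmf. A sketch with parameters I picked for illustration.

```python
from math import comb, exp, factorial

def binomial_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return lam**k / factorial(k) * exp(-lam)

# Large n, small p, lam = n * p = 2: the two pmfs nearly coincide.
n, p = 1000, 0.002
lam = n * p
max_diff = max(abs(binomial_pmf(n, p, k) - poisson_pmf(lam, k))
               for k in range(10))
```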

Geometric distribution \(X\sim G(p)\)

The first success occurs exactly on the \(k\)-th trial (the preceding \(k-1\) trials are failures).
\[ P(X=k)=(1-p)^{k-1}p \]
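A quick sanity check on the geometric pmf; \(p=0.25\) is an arbitrary choice of mine, and the infinite sum is approximated by a long partial sum.

```python
# Geometric distribution: first success on trial k, success probability p.
p = 0.25

def geom_pmf(p, k):
    return (1 - p)**(k - 1) * p

# The probabilities over k = 1, 2, ... sum to 1; a partial sum up to k = 199
# already captures essentially all of the mass.
partial = sum(geom_pmf(p, k) for k in range(1, 200))
```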
Hypergeometric distribution \(X\sim H(n,N_1,N)\)
\[ P(X=k)=\frac{{N_1\choose k}{N-N_1\choose n-k}}{{N\choose n}}\qquad k=0,1,\dots,\min(n,N_1) \]
The hypergeometric distribution can be understood as follows: from \(N\) balls ( \(N_1\) of them black, the rest white), draw \(n\) balls at random; \(P(X=k)\) is the probability that exactly \(k\) of them are black. When \(N\gg n\), it is close to the binomial distribution.
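The ball-drawing interpretation maps directly onto binomial coefficients; a minimal sketch (the 10-ball example is my own).

```python
from math import comb

def hypergeom_pmf(N, N1, n, k):
    """Draw n balls from N (N1 of them black); probability of exactly k black."""
    return comb(N1, k) * comb(N - N1, n - k) / comb(N, n)

# 4 black balls among 10; draw 3 without replacement.
probs = [hypergeom_pmf(10, 4, 3, k) for k in range(4)]  # k = 0..min(n, N1)
total = sum(probs)  # the pmf sums to 1
```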

Numerical characteristics of random variables

Mathematical Expectation

Since a whole series of problems involves mathematical expectation, it is explained here.

Binomial distribution: \(E(X)=np\)

Two-point distribution: \(E(X)=p\)

Poisson distribution: \(E(X)=\lambda\)

Geometric distribution: \(E(X)=\frac{1}{p}\)

Some formulas for expectation:
\[ E(X)=\sum x_ip_i\\ E(C)=C\\ E(kX+b)=kE(X)+b\\ E(X\pm Y)=E(X)\pm E(Y)\\ E[f(X)]=\sum f(x_i)p_i \]
If \(X\) and \(Y\) are independent, then \(E(XY)=E(X)E(Y)\).
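The linearity formulas above are easy to verify on a concrete distribution column; a sketch with a hypothetical \(X\) of my own.

```python
# Distribution column of a hypothetical random variable X.
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]

def E(values, probs):
    """E(X) = sum of x_i * p_i."""
    return sum(x * p for x, p in zip(values, probs))

ex = E(xs, ps)  # E(X) = 2.1

# Linearity: E(kX + b) = k E(X) + b, here with k = 3, b = 1.
lhs = E([3 * x + 1 for x in xs], ps)
rhs = 3 * ex + 1
```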

Variance

Variance: \(D(X)=E[(X-E(X))^2]=E(X^2)-(E(X))^2=\sum x_i^2p_i-\left(\sum x_ip_i\right)^2\)

Binomial distribution: \(D(X)=np(1-p)\)

Poisson distribution: \(D(X)=\lambda\)

Geometric distribution: \(D(X)=\frac{1-p}{p^2}\)

Variance properties: \(D(C)=0\), \(D(X+C)=D(X)\), \(D(CX)=C^2D(X)\), \(D(X\pm Y)=D(X)+D(Y)\) (when \(X\) and \(Y\) are independent), \(D(X)=0\Leftrightarrow P(X=C)=1\).
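The shortcut formula \(D(X)=E(X^2)-(E(X))^2\) and the scaling property \(D(CX)=C^2D(X)\) can be checked on a small distribution column; the values below are my own example.

```python
# Distribution column of a hypothetical random variable X.
xs = [0, 1, 2]
ps = [0.3, 0.4, 0.3]

def E(values, probs):
    return sum(x * p for x, p in zip(values, probs))

# D(X) = E(X^2) - (E(X))^2.
ex = E(xs, ps)
dx = E([x * x for x in xs], ps) - ex**2  # 0.6

# D(CX) = C^2 D(X), here with C = 2: variance of 2X should be 4 * D(X).
dcx = E([(2 * x)**2 for x in xs], ps) - E([2 * x for x in xs], ps)**2
```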


Source: www.cnblogs.com/chy-2003/p/11469686.html