Total probability formula and Bayes' formula: posterior probability and belief measure

abstract

  • The total probability formula and Bayes' formula, and their applications
  • Posterior probabilities and belief measures

Complete event group (partition)

  • Let the finite index set be $I=\{1,\cdots,n\}$, and let the sample space of trial $E$ be $\Omega$
  • $\{B_i;i\in I\}$ satisfies:
    • $\bigcup_{i=1}^{n}B_i=\Omega$
    • $B_iB_j=\varnothing,\ i\neq j$
  • Then $\{B_i;i\in I\}$ is called a complete event group of $\Omega$, also called a partition

basic properties

  • Given a complete event group $\{B_i;i\in I\}$, every sample point of the trial (every possible outcome) belongs to exactly one $B_i$

total probability formula

  • Let the sample space of experiment $E$ be $S$, and let $A$ be an event of $E$

  • Suppose $\{B_i\mid i\in I\}$ is a partition of $S$ with $P(B_i)>0$ for every $i\in I$

    • Then $P(A)=\sum\limits_{i\in I}P(A|B_i)P(B_i)$
  • Proof:

    • Clearly $AB_i\subset B_i$, and $B_iB_j=\varnothing$, so $(AB_i)(AB_j)=A(B_iB_j)=A\varnothing=\varnothing$ for $i\neq j$

    • $P(A|B_i)P(B_i)=P(AB_i)$

    • Proof 1:

      • $\sum\limits_{i\in I}P(A|B_i)P(B_i)=\sum\limits_{i\in I}P(AB_i)$
        • $=P(AB_1\cup AB_2\cup\cdots\cup AB_n)$
        • $=P\left(A\cap\left(\bigcup\limits_{i\in I}B_i\right)\right)$
        • $=P(AS)$
        • $=P(A)$
    • Proof 2:

      • $A=AS=A(B_1\cup\cdots\cup B_n)=AB_1\cup\cdots\cup AB_n$
      • $P(A)=P(AB_1\cup\cdots\cup AB_n)=\sum\limits_{i=1}^{n}P(AB_i)=\sum\limits_{i\in I}P(A|B_i)P(B_i)$
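The formula is easy to sanity-check numerically. Below is a minimal sketch (my own illustration, not part of the original notes): a brute-force verification of $P(A)=\sum_i P(A|B_i)P(B_i)$ on a toy die-rolling sample space with a made-up partition and event.

```python
# Brute-force check of the total probability formula on a toy sample space.
# The partition and the event A below are arbitrary illustrative choices.
omega = [1, 2, 3, 4, 5, 6]
p = {w: 1 / 6 for w in omega}              # P({w}) for a fair die

partition = [{1, 2}, {3, 4}, {5, 6}]       # B_1, B_2, B_3
A = {2, 3, 5}                              # an arbitrary event

def prob(event):
    """P(event) as the sum of elementary probabilities."""
    return sum(p[w] for w in event)

direct = prob(A)
via_formula = sum(prob(A & B_i) / prob(B_i) * prob(B_i)   # P(A|B_i) * P(B_i)
                  for B_i in partition)
print(direct, via_formula)                 # both 0.5 (up to float rounding)
```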

example

  • A box contains 20 products

    • The probabilities that it contains 0, 1, or 2 defective products are 0.8, 0.1, and 0.1 respectively

      • Write $A_i$ = {the box contains exactly $i$ defective products}
        • $P(A_0)=0.8$
        • $P(A_1)=P(A_2)=0.1$
    • Write $B$ = {the 4 products drawn are all genuine}

  • What is the probability of event $B$?

    • $A_0,A_1,A_2$ constitute a partition of the sample space of trial $E$ (observe the number of defective products in the box; the sample space is $\{0,1,2\}$)

    • Event $B$ belongs to trial $F$ (observe the number of defective products among the 4 drawn; here its sample space is also $\{0,1,2\}$)

    • The following conditional probabilities are easy to compute with the classical (equally likely) model, because once $A_i$ is fixed the composition of the box is known (we do not expand them further with the conditional probability formula, which would only go around in circles)

      • $P(B|A_0)=\frac{\binom{20}{4}}{\binom{20}{4}}=1$

      • $P(B|A_1)=\frac{\binom{19}{4}}{\binom{20}{4}}=\frac{19\cdot18\cdot17\cdot16}{20\cdot19\cdot18\cdot17}=\frac{16}{20}=\frac{4}{5}$

        • (the denominator is still $\binom{20}{4}$ because, given $A_1$, the 4 products are still drawn from all 20 in the box)
      • $P(B|A_2)=\frac{\binom{18}{4}}{\binom{20}{4}}=\frac{12}{19}$

    • By the total probability formula, $P(B)=\sum\limits_{i=0}^{2}P(B|A_i)P(A_i)\approx0.943$

    • Note: here the sample space of trial $F$ happens to coincide with that of $E$; but if the box could contain more than 4 defective products, the sample space of $F$ would be $\{0,1,2,3,4\}$ and would be contained in the sample space of $E$
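A quick numeric check of this example (my own sketch, not from the original notes), computing the conditional probabilities with `math.comb` and combining them with the total probability formula:

```python
# P(B|A_i) = C(20-i, 4) / C(20, 4) by the classical model, then
# P(B) = sum_i P(B|A_i) P(A_i).
from math import comb

priors = {0: 0.8, 1: 0.1, 2: 0.1}                      # P(A_i)
likelihoods = {i: comb(20 - i, 4) / comb(20, 4)        # P(B|A_i)
               for i in priors}

p_b = sum(likelihoods[i] * priors[i] for i in priors)
print(likelihoods)       # {0: 1.0, 1: 0.8, 2: 0.6315...}
print(round(p_b, 3))     # 0.943
```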

Bayesian formula

  • Bayes' formula builds on the total probability formula above; it looks more involved, but the underlying principle is simple
  • Let the sample space of experiment $E$ be $S$, and let $A$ be an event of $E$
  • If $\{B_i\mid i\in I\}$ is a partition of $S$ with $P(A)>0$ and $P(B_i)>0$ for every $i\in I$, then
    • $P(B_i|A)=\frac{P(AB_i)}{P(A)}=\frac{P(A|B_i)P(B_i)}{\sum_{j=1}^{n}P(A|B_j)P(B_j)},\quad i=1,\cdots,n$; this is Bayes' formula
  • Proof: by the definition of conditional probability, the multiplication theorem, and the total probability formula, Bayes' formula follows immediately
    • $P(B_i|A)=\frac{P(AB_i)}{P(A)}$
    • $P(AB_i)=P(A|B_i)P(B_i)$
      • Note that what Bayes' formula asks for is $P(B_i|A)$, so expanding $P(AB_i)$ as $P(B_i|A)P(A)$ would be circular and lead nowhere
      • We therefore expand $P(AB_i)=P(A|B_i)P(B_i)$ instead
    • $P(A)=\sum\limits_{i=1}^{n}P(A|B_i)P(B_i)$
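The whole computation can be packaged as a small helper; the sketch below is my own illustration (the function name and the demo numbers are made up), returning $P(A)$ and all posteriors $P(B_i|A)$ from the priors and likelihoods over a partition.

```python
def bayes_posteriors(priors, likelihoods):
    """priors: P(B_i); likelihoods: P(A|B_i), both listed over one partition.
    Returns P(A) and the list of posteriors P(B_i|A)."""
    p_a = sum(l * p for l, p in zip(likelihoods, priors))   # total probability
    return p_a, [l * p / p_a for l, p in zip(likelihoods, priors)]

# Toy numbers, purely illustrative
p_a, posteriors = bayes_posteriors([0.5, 0.5], [0.9, 0.1])
print(p_a, posteriors)   # 0.5 [0.9, 0.1]
```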

example

  • Source of defective products: Suppose a batch of parts comes from three suppliers

    • supplier Defective rate Purchase share
      1 0.02 0.15
      2 0.01 0.80
      3 0.03 0.05
  • Experiment: draw one part at random from the batch

    • $A$ = {the drawn part is defective}
    • $B_i$ = {the drawn part comes from supplier $i$}
      • $P(B_1)=0.15$; $P(A|B_1)=0.02$
      • $P(B_2)=0.80$; $P(A|B_2)=0.01$
      • $P(B_3)=0.05$; $P(A|B_3)=0.03$
  • Find the probability that the drawn part is defective:

    • Total probability formula: $P(A)=\sum\limits_{i=1}^{3}P(A|B_i)P(B_i)=0.02\times0.15+0.01\times0.80+0.03\times0.05=0.0125$
  • If the drawn part turns out to be defective, what is the probability that it came from supplier $i$ ($i=1,2,3$)?

    • By Bayes' formula:
      • $P(B_i|A)=\frac{P(B_iA)}{P(A)}=\frac{P(A|B_i)P(B_i)}{P(A)}$
    • Computing each in turn:
      • $P(B_1|A)=\frac{0.02\times0.15}{0.0125}=0.24$
      • $P(B_2|A)=0.64$
      • $P(B_3|A)=0.12$
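A numeric check of this example (my own sketch): the total probability of a defective part, followed by the three posteriors.

```python
# P(A) by the total probability formula, then P(B_i|A) by Bayes' formula.
shares = [0.15, 0.80, 0.05]          # P(B_i): purchase shares
defect_rates = [0.02, 0.01, 0.03]    # P(A|B_i): defective rates

p_defective = sum(r * s for r, s in zip(defect_rates, shares))
posteriors = [r * s / p_defective for r, s in zip(defect_rates, shares)]
print(round(p_defective, 4))              # 0.0125
print([round(x, 2) for x in posteriors])  # [0.24, 0.64, 0.12]
```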

Common form with complementary events

  • The total probability formula and Bayes' formula are used most often with $n=2$, i.e. when an event $B$ and its complement $\overline{B}$ form the partition of the sample space
  • In this case:
    • $P(A)=P(AB)+P(A\overline{B})=P(A|B)P(B)+P(A|\overline{B})P(\overline{B})$
    • $P(B|A)=\frac{P(AB)}{P(A)}=\frac{P(A|B)P(B)}{P(A|B)P(B)+P(A|\overline{B})P(\overline{B})}$
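This two-event form is small enough to write as a one-off helper; the sketch below is my own (the function name and demo numbers are made up), and the same pattern is used informally in the prior/posterior examples later in these notes.

```python
def posterior(p_b, p_a_given_b, p_a_given_not_b):
    """Return P(B|A) from P(B), P(A|B) and P(A|not B) via the two-event form."""
    p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # total probability
    return p_a_given_b * p_b / p_a

print(posterior(0.5, 0.9, 0.1))   # 0.9 (illustrative numbers only)
```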

example

  • Suppose the first box contains 2 red balls and 1 white ball, while in the second box half of the balls are red and half are white
  • First take one ball from each of the two boxes to form a pair, then pick one of these two balls at random
    1. Event $C$: what is the probability that the picked ball is red?
    2. Given that $C$ has occurred, event $D$: what is the probability that this red ball comes from the first box?
  • Solution
  • (1)
    • Method 1:
      • Let $A_i$: {the picked ball comes from box $i$}, $i=1,2$
      • By the total probability formula, $P(C)=\sum\limits_{i=1}^{2}P(C|A_i)P(A_i)=\frac{2}{3}\times\frac{1}{2}+\frac{1}{2}\times\frac{1}{2}=\frac{7}{12}$
    • Method 2:
      • Let $A_i$: {the ball drawn from box $i$ is red}, $i=1,2$
      • $B_1=A_1A_2$; $B_2=\overline{A_1}A_2$; $B_3=A_1\overline{A_2}$; $B_4=\overline{A_1}\,\overline{A_2}$
        • $P(B_1)=P(A_1)P(A_2)=\frac{2}{3}\times\frac{1}{2}=\frac{1}{3}$
        • Similarly, $P(B_2)=\frac{1}{3}\times\frac{1}{2}=\frac{1}{6}$; $P(B_3)=\frac{2}{3}\times\frac{1}{2}=\frac{1}{3}$; $P(B_4)=\frac{1}{3}\times\frac{1}{2}=\frac{1}{6}$
      • By the total probability formula: $P(C)=\sum\limits_{i=1}^{4}P(C|B_i)P(B_i)=1\times\frac{1}{3}+\frac{1}{2}\times\frac{1}{6}+\frac{1}{2}\times\frac{1}{3}+0\times\frac{1}{6}=\frac{7}{12}$
  • (2)
    • Let $A_i$: {the ball drawn from box $i$ is red}, $i=1,2$
    • Method 1:
      • $P(A_1|C)=\frac{P(C|A_1)P(A_1)}{P(C)}$
        • $P(C|A_1)=P(C|A_1A_2)P(A_2)+P(C|A_1\overline{A_2})P(\overline{A_2})=1\times\frac{1}{2}+\frac{1}{2}\times\frac{1}{2}=\frac{3}{4}$ (using the independence of $A_1$ and $A_2$)
      • $P(A_1|C)=\frac{\frac{3}{4}\times\frac{2}{3}}{7/12}=\frac{6}{7}$
    • Method 2:
      • $A_1=A_1A_2+A_1\overline{A_2}$
      • $P(A_1|C)=P([A_1A_2+A_1\overline{A_2}]|C)=P(A_1A_2|C)+P(A_1\overline{A_2}|C)=P(A_1A_2C)/P(C)+P(A_1\overline{A_2}C)/P(C)$
        • $=P(C|A_1A_2)P(A_1A_2)/P(C)+P(C|A_1\overline{A_2})P(A_1\overline{A_2})/P(C)$
        • $=\frac{6}{7}$
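A Monte Carlo sketch of this example (my own illustration), with the events defined exactly as in the solution above: $A_1$ = {the ball drawn from the first box is red} and $C$ = {the finally picked ball is red}.

```python
# Estimate P(C) and P(A1|C) by simulation.
import random

random.seed(0)
N = 200_000
c_count = a1_and_c = 0
for _ in range(N):
    ball1_red = random.random() < 2 / 3    # box 1: 2 red, 1 white
    ball2_red = random.random() < 1 / 2    # box 2: half red, half white
    picked_red = ball1_red if random.random() < 0.5 else ball2_red
    if picked_red:
        c_count += 1
        a1_and_c += ball1_red
print(c_count / N)          # ≈ 7/12 ≈ 0.583
print(a1_and_c / c_count)   # ≈ 6/7  ≈ 0.857
```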

example

  • There are two boxes: the first box contains $n$ items, all of them first-class; the second box contains $m$ items, of which only 1 is first-class
  • Pick one of the two boxes at random, then draw items from that box one after another (without replacement)
    1. What is the probability that the first item drawn is first-class? What is the probability that the second item drawn is first-class?
    2. Given that the first item drawn is first-class, what is the probability that the second item drawn is also first-class?
  • (1)
    • Let $A_i$: the chosen box is box $i$ (equivalently $A$: box 1 is chosen, $\overline{A}$: box 2 is chosen)
    • Let $B_i$: the $i$-th item drawn is first-class, $i=1,2$
      • $P(B_1)=\frac{1}{2}\times1+\frac{1}{2}\times\frac{1}{m}=\frac{m+1}{2m}$
      • $P(B_2)=P(B_2|A_1)P(A_1)+P(B_2|A_2)P(A_2)$
        • $=P(B_2|A_1B_1)P(A_1B_1)+P(B_2|A_1\overline{B_1})P(A_1\overline{B_1})+P(B_2|A_2B_1)P(A_2B_1)+P(B_2|A_2\overline{B_1})P(A_2\overline{B_1})$
        • $P(B_2|A_1)P(A_1)=1\times\frac{1}{2}$
        • $P(B_2|A_2)P(A_2)=P(B_2|A_2B_1)P(A_2B_1)+P(B_2|A_2\overline{B_1})P(A_2\overline{B_1})=0+P(B_2|A_2\overline{B_1})P(\overline{B_1}|A_2)P(A_2)=\frac{1}{m-1}\times\frac{m-1}{m}\times\frac{1}{2}=\frac{1}{2m}$
        • $P(B_2)=\frac{1}{2}+\frac{1}{2m}=\frac{1+m}{2m}$
      • Indeed: if box 1 is chosen, both draws are first-class; if box 2 is chosen, the single first-class item is equally likely to appear at the first or the second draw, so the probability of a first-class item is $\frac{1}{m}$ on each draw, and the first and second draws have the same probability of yielding a first-class item (a simulation check of part (1) follows this list)
  • (2)
    • Method 1:
      • $P(B_2|B_1)=\frac{P(B_1|B_2)P(B_2)}{P(B_1)}$
        • $P(B_1|B_2)=P(B_1|A_1B_2)P(A_1B_2)+P(B_1|A_2B_2)P(A_2B_2)=P(B_1|A_1B_2)P(B_2|A_1)P(A_1)+P(B_1|A_2B_2)P(B_2|A_2)P(A_2)=1\times1\times\frac{1}{2}+0=\frac{1}{2}$
        • $P(B_2|B_1)=\frac{1}{2}$ (using $P(B_1)=P(B_2)$ from part (1))
      • This method applies Bayes' formula directly, but the two probabilities in the numerator are not entirely straightforward to compute
      • Bayes' formula is the standard template for this kind of problem, but it is not necessarily the simplest route; constructing a more suitable sample space and more suitable events may simplify the solution
    • Method 2:
      • Let event $D$: $B_2$ occurs under the condition that $B_1$ occurs
      • Then by the total probability formula: $P(D)=P(D|A_1)P(A_1)+P(D|A_2)P(A_2)=1\times\frac{1}{2}+0\times\frac{1}{2}=\frac{1}{2}$
      • In other words, whether $D$ occurs is determined by whether $A_1$ or $A_2$ occurs: if $A_1$ occurs then $D$ is certain to occur, otherwise $D$ cannot occur; $m$ and $n$ are irrelevant
      • Method 2 is clearly much simpler and more direct
    • Wrong solution:
      • $P(B_2|B_1)=\frac{P(B_1B_2)}{P(B_1)}$
        • $=\frac{P(B_1B_2|A_1)P(A_1)+P(B_1B_2|A_2)P(A_2)}{P(B_1)}=\frac{m}{m+1}$
        • This result depends on $m$; when $m$ is very large, $P(B_2|B_1)$ would be close to 1, i.e. $B_2$ would be almost certain, which is clearly not right
    • Sample space analysis:
      • $S=\{(a_1,a_2),\cdots,(a_1,a_n),\cdots,(a_{n-1},a_n),\ (b_1,b_2),\cdots,(b_1,b_m),\cdots,(b_{m-1},b_m)\}$
        • where $a_i,\ i=1,\cdots,n$ denote the items in the first box and $b_j,\ j=1,\cdots,m$ denote the items in the second box
        • the points $(a_{i_1},a_{i_2}),\ i_1,i_2=1,\cdots,n$ number $A_n^2$ in total; the points $(b_{i_1},b_{i_2}),\ i_1,i_2=1,\cdots,m$ number $A_m^2$ in total
        • the two kinds of sample points are not equally likely as a whole, but sample points of the same kind are equally likely, with probabilities $\frac{1}{2}\times\frac{1}{A_n^2}$ and $\frac{1}{2}\times\frac{1}{A_m^2}$ respectively
        • these details are not essential for the present question
      • Let event $D_1$: $B_1$ and $B_2$ both occur; from the sample space one sees $D=D_1=B_1B_2$, so $P(D)=P(D_1)=\frac{1}{2}$
  • Note: if the $m$ items in the second box include $k$ first-class items, what is the probability of event $D$?
    • $P(D)=P(D|A_1)P(A_1)+P(D|A_2)P(A_2)=1\times\frac{1}{2}+\frac{k-1}{m-1}\times\frac{1}{2}=\frac{m+k-2}{2(m-1)}$
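For part (1), the closed form $P(B_1)=P(B_2)=\frac{m+1}{2m}$ is easy to confirm by simulation; the sketch below is my own, with purely illustrative values $n=5$, $m=4$.

```python
# Estimate P(B1) and P(B2) and compare with (m+1)/(2m).
import random

random.seed(1)
n, m, N = 5, 4, 200_000
b1 = b2 = 0
for _ in range(N):
    if random.random() < 0.5:
        box = [1] * n                      # box 1: all items first-class
    else:
        box = [1] + [0] * (m - 1)          # box 2: only 1 first-class among m
    first, second = random.sample(box, 2)  # two draws without replacement
    b1 += first
    b2 += second
print(b1 / N, b2 / N, (m + 1) / (2 * m))   # all three ≈ 0.625
```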

example

  • Suppose there are registration forms from 3 regions, containing 10, 15 and 25 forms respectively, of which 3, 7 and 5 are girls' forms
  • Randomly select one region, then draw 2 forms one after another from that region's forms
    1. What is the probability that the first form drawn is a girl's?
    2. Given that the second form drawn is a boy's, what is the probability that the first is a girl's?
  • Let $A_i$: the selected region is region $i$, $i=1,2,3$; $B_j$: the $j$-th form drawn is a girl's, $j=1,2$
  • Clearly $P(A_1)=P(A_2)=P(A_3)=\frac{1}{3}$
    • $P(B_1|A_1)=\frac{3}{10}$; $P(B_1|A_2)=\frac{7}{15}$; $P(B_1|A_3)=\frac{5}{25}$
  • (1): by the total probability formula: $P(B_1)=\sum\limits_{i=1}^{3}P(B_1|A_i)P(A_i)=\frac{29}{90}$ (an exact-arithmetic check follows this list)
  • (2): let $D$: $B_1$ occurs given that $\overline{B_2}$ occurs (i.e. the second form drawn is a boy's)
    • $P(D)=\sum\limits_{i=1}^{3}P(D|A_i)P(A_i)=\frac{3}{9}\times\frac{1}{3}+\frac{7}{14}\times\frac{1}{3}+\frac{5}{24}\times\frac{1}{3}=\frac{25}{72}$
    • Method 2: $P(D)=\sum\limits_{i=1}^{3}P(B_1|\overline{B_2}A_i)P(A_i)=\frac{25}{72}$
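An exact-arithmetic check of part (1) (my own sketch), combining the per-region conditional probabilities with the total probability formula:

```python
# P(B1) = sum_i P(B1|A_i) P(A_i), computed with exact fractions.
from fractions import Fraction as F

p_region = F(1, 3)                                     # P(A_i)
p_girl_given_region = [F(3, 10), F(7, 15), F(5, 25)]   # P(B1|A_i)

p_b1 = sum(g * p_region for g in p_girl_given_region)
print(p_b1)   # 29/90
```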

Prior probability and posterior probability

example

  • A machine and its product pass rate

  • Assume that when the machine is working normally the pass rate of the products it produces is 0.9; otherwise the pass rate is 0.3

  • When the machine is switched on, the probability that it is working normally is 0.75 (the prior probability)

  • On a certain day the machine's first product is a pass. What is the probability that the machine is working normally?

  • Analysis:

    • $A$ = {the first product is a pass}
    • $B$ = {the machine is working normally}
    • The required probability is $P(B|A)$
  • From the assumptions:

    • $P(A|B)=0.9$; $P(A|\overline{B})=0.3$
    • $P(B)=0.75$, $P(\overline{B})=0.25$
      • $B,\overline{B}$ constitute a partition of the sample space (the machine is either normal or not)
    • By the total probability formula, $P(A)=P(A|B)P(B)+P(A|\overline{B})P(\overline{B})=0.9\times0.75+0.3\times0.25=0.75$
    • Then by Bayes' formula, $P(B|A)=\frac{P(A|B)P(B)}{P(A)}=\frac{0.9\times0.75}{0.75}=0.9$ (a numeric sketch of this computation follows after this list)
    • That is, given that the first product is a pass, the posterior probability that the machine is working normally is 0.9
  • Posterior probability is a modification of prior probability

  • There are two schools of interpretation of prior and posterior probabilities

    • Objectivist (frequentist) view: among the days on which the first product is a pass, the machine is working normally on about 90 out of every 100 of them
    • Subjectivist view: the probabilities reflect people's differing subjective beliefs about the state of the machine before and after the observation
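A numeric sketch of the machine example above (my own illustration), using the two-event form of the total probability and Bayes formulas:

```python
# Two-event total probability and Bayes update for the machine state.
p_normal = 0.75                             # prior P(B)
p_pass_normal, p_pass_faulty = 0.9, 0.3     # P(A|B), P(A|not B)

p_pass = p_pass_normal * p_normal + p_pass_faulty * (1 - p_normal)
posterior_normal = p_pass_normal * p_normal / p_pass
print(round(p_pass, 2), round(posterior_normal, 2))   # 0.75 0.9
```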

Probability as a measure of people's beliefs about objective events

  • Take Aesop’s fable The Boy Who Cried Wolf as an example.
  • $A$ = {the child lies}; $B$ = {the child is trustworthy}. Assume a trustworthy child is relatively unlikely to lie, say with probability 0.1, i.e. $P(A|B)=0.1$; conversely, an untrustworthy child lies with probability 0.5, i.e. $P(A|\overline{B})=0.5$
  • Assume the probability that the villagers are dealing with a trustworthy child is 0.8, i.e. $P(B)=0.8$
  • Then the probability that the child tells a lie is $P(A)=P(A|B)P(B)+P(A|\overline{B})P(\overline{B})=0.1\times0.8+0.5\times0.2=0.18$
  • Now suppose the child has told a lie once; by Bayes' formula, the probability that the child is trustworthy becomes
    • $P(B|A)=\frac{P(BA)}{P(A)}=\frac{P(A|B)P(B)}{P(A|B)P(B)+P(A|\overline{B})P(\overline{B})}=\frac{0.1\times0.8}{0.18}=\frac{4}{9}\approx0.444$
    • That is, the posterior probability that the child is trustworthy drops to about 0.444
  • If the child lies a second time, take $P(B)=0.444$ and update again as above:
    • $P(B|A)=\frac{0.444\times0.1}{0.444\times0.1+0.556\times0.5}\approx0.138$
  • So after the child has lied twice, the posterior probability that he is trustworthy has dropped to 0.138, and he now comes across as an almost untrustworthy person
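The repeated updating above can be written as a short loop; the sketch below is my own, feeding each posterior back in as the next prior.

```python
# Iterated two-event Bayes update for the fable: prior 0.8, two observed lies.
p_lie_trustworthy = 0.1      # P(A|B)
p_lie_untrustworthy = 0.5    # P(A|not B)

p_trust = 0.8                # initial prior P(B)
for k in (1, 2):             # the child lies twice
    p_lie = p_lie_trustworthy * p_trust + p_lie_untrustworthy * (1 - p_trust)
    p_trust = p_lie_trustworthy * p_trust / p_lie
    print(k, round(p_trust, 3))   # 1 0.444, then 2 0.138
```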

Source: blog.csdn.net/xuchaoxin1375/article/details/133128913