Posterior probability and conditional probability (to be continued)

Prior probability: a probability based on past experience and analysis; informally, one derived from statistics and known regularities.
Posterior probability: the probability of a cause given an observed result. For example, given that a product is known to be defective, the probability that it came from workshop A; this can be obtained through the Bayes formula.

\[P(A|B) = P(A) \times \frac{P(B|A)}{P(B)} \]

Here \(P(A|B)\) is called the posterior probability, \(P(A)\) is called the prior probability, and \(\frac{P(B|A)}{P(B)}\) is called the likelihood function or adjustment factor.

Posterior probability = prior probability x adjustment factor
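This update rule can be sketched as a small function (a minimal illustration; the function name and arguments are my own, not from the original post):

```python
def posterior(prior, likelihood, evidence):
    """Bayes' formula: P(A|B) = P(A) * P(B|A) / P(B).

    The ratio likelihood / evidence is the adjustment factor
    that rescales the prior into the posterior.
    """
    return prior * likelihood / evidence


# When the adjustment factor is > 1 the prior is raised;
# when it is < 1 the prior is lowered.
print(posterior(0.25, 0.05, 0.0345))
```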

Example

Three workshops A, B, and C produce the same product. Their shares of the output are 25%, 35%, and 40%, and their defective rates are 5%, 4%, and 2%, respectively.
A defective product is now found among the output. What is the probability that it was produced in workshop A?

Solution:

P(A) = 0.25, P(B) = 0.35, P(C)=0.4

Let D be the event that the product is defective:
P(D|A) = 0.05, P(D|B) = 0.04, P(D|C) = 0.02

Factory-wide defective rate (law of total probability):
P(D) = P(D|A)P(A) + P(D|B)P(B) + P(D|C)P(C) = 0.05 x 0.25 + 0.04 x 0.35 + 0.02 x 0.4 = 0.0345

The probability that the defective product was produced in workshop A, by Bayes' theorem:

\[P(A|D) = \frac{ P(A)P(D|A) }{ P(D) }=\frac{0.25 \times 0.05}{0.0345}=0.362 \]
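The whole calculation can be checked in a few lines (a sketch using the numbers from the example; the variable names are my own):

```python
# Output shares (priors) and defective rates (likelihoods) per workshop
priors = {"A": 0.25, "B": 0.35, "C": 0.40}
defect_rates = {"A": 0.05, "B": 0.04, "C": 0.02}

# Law of total probability: P(D), the factory-wide defective rate
p_defective = sum(priors[w] * defect_rates[w] for w in priors)

# Bayes' theorem: P(A|D) = P(A) * P(D|A) / P(D)
p_a_given_d = priors["A"] * defect_rates["A"] / p_defective

print(round(p_defective, 4))  # 0.0345
print(round(p_a_given_d, 3))  # 0.362
```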

Prior probability: based on past experience and statistical regularities.
Posterior probability: inferring the cause from the result (the parameters given the data).
Likelihood: the data given the parameters.


Origin: www.cnblogs.com/woodyh5/p/12732278.html