Naive Bayes Algorithm Study Notes
These notes are for personal study and understanding only.

The naive Bayes classifier is a classification method based on Bayes' theorem.
Bayes' formula

$$P(A \mid B)=\frac{P(A, B)}{P(B)}=\frac{P(B \mid A)\,P(A)}{P(B)}$$

where:

- $P(A)$: the prior probability
- $P(A \mid B)$: the posterior probability
- $P(B \mid A)$: the probability that $B$ occurs given that event $A$ has occurred, i.e. the likelihood
- $P(B)$: the evidence factor; it is the same for every class label, so it is independent of the class

Expanding the denominator by the law of total probability:

$$P(A_{i} \mid B)=\frac{P(B \mid A_{i})\,P(A_{i})}{\sum_{j} P(B \mid A_{j})\,P(A_{j})}$$
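As a quick numeric illustration of the formula above (all probabilities here are invented for illustration only):

```python
# Hedged numeric sketch of Bayes' formula with two classes A1, A2.
# All numbers below are made up for illustration.
p_a = [0.3, 0.7]          # priors P(A1), P(A2)
p_b_given_a = [0.8, 0.1]  # likelihoods P(B | A1), P(B | A2)

# Evidence P(B) via the law of total probability (the denominator above)
p_b = sum(pa * pb for pa, pb in zip(p_a, p_b_given_a))  # 0.3*0.8 + 0.7*0.1 = 0.31

# Posterior P(A1 | B) = P(B | A1) P(A1) / P(B)
p_a1_given_b = p_b_given_a[0] * p_a[0] / p_b
print(round(p_a1_given_b, 4))  # 0.7742
```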
classification problem
Naive Bayesian Algorithm (with Example Explanation)
Bayes' formula can be expressed as:

$$P(y=c_{n} \mid x=X)=\frac{P(x=X,\, y=c_{n})}{P(x=X)}=\frac{P(x=X \mid y=c_{n})\,P(y=c_{n})}{P(x=X)}$$

Assuming the attributes are mutually independent:

$$P(x=X \mid y=c_{n})=\prod_{i=1}^{m} P(x^{i}=a_{i} \mid y=c_{n})$$

The naive Bayes formula gives the probability that $y$ takes each class value, conditioned on the feature set $x$:

$$P(y=c_{n} \mid x=X)=\frac{P(y=c_{n}) \prod_{i=1}^{m} P(x^{i}=a_{i} \mid y=c_{n})}{P(x=X)}$$

Take the value of $y$ that maximizes this conditional probability as the prediction:

$$f(x)=\underset{c_{n}}{\operatorname{argmax}}\left(\frac{P(y=c_{n}) \prod_{i=1}^{m} P(x^{i}=a_{i} \mid y=c_{n})}{P(x=X)}\right)$$

Since the evidence factor $P(x=X)$ is the same for every class label, it can be omitted:

$$f(x)=\underset{c_{n}}{\arg\max}\left(P(y=c_{n}) \prod_{i=1}^{m} P(x^{i}=a_{i} \mid y=c_{n})\right)$$
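The decision rule amounts to scoring each class and taking the largest score; a minimal sketch (the scores are illustrative, roughly matching the worked example later in these notes):

```python
# Minimal sketch of the argmax decision rule: pick the class whose
# unnormalized score P(y=c) * prod_i P(x^i = a_i | y=c) is largest.
# The numbers are illustrative only.
scores = {
    "low":  (1 / 7) * 0.0 * 0.0,  # P(y=low)  * P(A=2 | low)  * P(B=2 | low)
    "mid":  (2 / 7) * 0.0 * 0.0,
    "high": (4 / 7) * 0.5 * 0.5,
}
prediction = max(scores, key=scores.get)
print(prediction)  # "high"
```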
The prior and the likelihoods are estimated from frequencies in the training set, where $I(\cdot)$ is the indicator function:

$$P(y=c_{n})=\frac{\sum_{i=1}^{N} I(y_{i}=c_{n})}{N}, \quad n=1,2,\ldots,K$$

$$P(x^{j}=a_{j} \mid y=c_{n})=\frac{\sum_{i=1}^{N} I(x_{i}^{j}=a_{j},\, y_{i}=c_{n})}{\sum_{i=1}^{N} I(y_{i}=c_{n})}$$
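These count-based estimates can be written directly in code; a minimal sketch using one coded label column and one coded feature column from the worked example below:

```python
# Sketch of the two indicator-sum estimates above, using the 0/1/2
# coded columns from the worked example in these notes.
from collections import Counter

y = [2, 2, 2, 0, 1, 2, 1]  # class labels y_i (sales C)
x = [0, 2, 0, 0, 1, 2, 0]  # one feature column x^1 (price A)

N = len(y)
# Prior: P(y = c_n) = sum_i I(y_i = c_n) / N
prior = {c: cnt / N for c, cnt in Counter(y).items()}

# Likelihood: P(x^1 = a | y = c) = #{i : x_i = a and y_i = c} / #{i : y_i = c}
def likelihood(a, c):
    joint = sum(1 for xi, yi in zip(x, y) if xi == a and yi == c)
    return joint / sum(1 for yi in y if yi == c)

print(prior[2])          # 4/7
print(likelihood(2, 2))  # 2/4 = 0.5
```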
Calculation example 1
Code implementation of the Naive Bayes algorithm (Python)
| Price A | Lessons B | Sales C | Price A (coded) | Lessons B (coded) | Sales C (coded) |
| --- | --- | --- | --- | --- | --- |
| low | many | high | 0 | 2 | 2 |
| high | middle | high | 2 | 1 | 2 |
| low | few | high | 0 | 0 | 2 |
| low | middle | low | 0 | 1 | 0 |
| middle | middle | middle | 1 | 1 | 1 |
| high | many | high | 2 | 2 | 2 |
| low | few | middle | 0 | 0 | 1 |

Predict the sales C when price A = 2 (high) and lesson hours B = 2 (many).
First, encode the Chinese category labels into the numeric codes from the table:

```python
from __future__ import division


def set_data(price, time, sale):
    """Encode the Chinese category labels into numeric codes:
    低/中/高 = low/middle/high, 少/中/多 = few/middle/many."""
    price_number = []
    time_number = []
    sale_number = []
    for i in price:
        if i == "低":    # low
            price_number.append(0)
        elif i == "中":  # middle
            price_number.append(1)
        elif i == "高":  # high
            price_number.append(2)
    for j in time:
        if j == "少":    # few
            time_number.append(0)
        elif j == "中":  # middle
            time_number.append(1)
        elif j == "多":  # many
            time_number.append(2)
    for k in sale:
        if k == "低":    # low
            sale_number.append(0)
        elif k == "中":  # middle
            sale_number.append(1)
        elif k == "高":  # high
            sale_number.append(2)
    return price_number, time_number, sale_number


price = ["低", "高", "低", "低", "中", "高", "低"]
time = ["多", "中", "少", "中", "中", "多", "少"]
sale = ["高", "高", "高", "低", "中", "高", "中"]

price_number, time_number, sale_number = set_data(price, time, sale)
print(price_number, time_number, sale_number)
```
$$P(C=0 \mid x=X) \propto P(C=0)\,P(A=2 \mid C=0)\,P(B=2 \mid C=0)$$

$$P(C=1 \mid x=X) \propto P(C=1)\,P(A=2 \mid C=1)\,P(B=2 \mid C=1)$$

$$P(C=2 \mid x=X) \propto P(C=2)\,P(A=2 \mid C=2)\,P(B=2 \mid C=2)$$
```python
from __future__ import division

price_number = [0, 2, 0, 0, 1, 2, 0]
time_number = [2, 1, 0, 1, 1, 2, 0]
sale_number = [2, 2, 2, 0, 1, 2, 1]

# Query: price A = 2 (high), lesson hours B = 2 (many)
exprice_number = 2
extime_number = 2

# Class counts for sales C = 0 (low), 1 (middle), 2 (high)
sale0p = sale_number.count(0)
sale1p = sale_number.count(1)
sale2p = sale_number.count(2)

# a_n counts samples with A = 2 in class C = n; b_n likewise for B = 2
a0 = a1 = a2 = 0
b0 = b1 = b2 = 0
for i in range(len(sale_number)):
    if price_number[i] == exprice_number:
        if sale_number[i] == 0:
            a0 += 1
        elif sale_number[i] == 1:
            a1 += 1
        elif sale_number[i] == 2:
            a2 += 1
    if time_number[i] == extime_number:
        if sale_number[i] == 0:
            b0 += 1
        elif sale_number[i] == 1:
            b1 += 1
        elif sale_number[i] == 2:
            b2 += 1

# Likelihoods P(A = 2 | C = n) and P(B = 2 | C = n)
pa0 = a0 / sale0p
pa1 = a1 / sale1p
pa2 = a2 / sale2p
pb0 = b0 / sale0p
pb1 = b1 / sale1p
pb2 = b2 / sale2p

# Priors P(C = n)
pc0 = sale0p / len(sale_number)
pc1 = sale1p / len(sale_number)
pc2 = sale2p / len(sale_number)

# Unnormalized posteriors P(C = n | A = 2, B = 2)
pcc0 = pc0 * pa0 * pb0
pcc1 = pc1 * pa1 * pb1
pcc2 = pc2 * pa2 * pb2

indf = (pcc0, pcc1, pcc2)
print(indf)

max_indf = indf.index(max(indf))
if max_indf == 0:
    print('销量低')  # low sales
elif max_indf == 1:
    print('销量中')  # middle sales
elif max_indf == 2:
    print('销量高')  # high sales
```
Referenced blog posts:

- Simple example of the naive Bayes classification algorithm
- Naive Bayes algorithm (with example explanation)
- Code implementation of the naive Bayes algorithm (Python)