Computing information gain in Python for feature selection

Copyright notice: This is an original post by the author; do not reproduce without permission. https://blog.csdn.net/fei13971414170/article/details/80009985

This post implements information-gain computation for feature selection in Python, handling data whose features mix continuous and discrete attributes.

## My senior labmate asked me to write feature-selection code. Most implementations I found online only compute information gain for discrete attributes, but my data contains both discrete and continuous attributes, so I implemented a version that handles both.
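To recap the quantity being computed: the information gain of a split is the label entropy minus the size-weighted entropies of the two resulting subsets. A minimal sketch (the `entropy` helper is my own illustration, not part of the code below):

```python
import math

def entropy(labels):
    """Shannon entropy (in nats, matching math.log) of a label list."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log(labels.count(c) / n)
                for c in set(labels))

# Toy example: a split that perfectly separates the classes removes all
# uncertainty, so its information gain equals the original label entropy.
y = [0, 0, 1]
left, right = [0, 0], [1]          # a candidate split of y
gain = entropy(y) - (len(left) / len(y) * entropy(left)
                     + len(right) / len(y) * entropy(right))
print(round(gain, 4))  # both halves are pure, so gain == entropy(y)
```

For a continuous attribute there is no single natural split, so the code below tries every midpoint between consecutive sorted values and keeps the best gain.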

Code:

```python
import math

import numpy as np


class IG():
    def __init__(self, X, y):
        X = np.array(X)
        n_feature = np.shape(X)[1]
        n_y = len(y)

        # Entropy of the labels before any split (natural log; pass base 2
        # to math.log for bits -- the feature ranking is unchanged either way).
        orig_H = 0
        for c in set(y):
            p = y.count(c) / n_y
            orig_H += -p * math.log(p)

        IG_list = []
        for i in range(n_feature):
            feature = X[:, i]
            sorted_feature = sorted(feature)
            # Candidate thresholds: midpoints of consecutive sorted values.
            threshold = [(sorted_feature[ind - 1] + sorted_feature[ind]) / 2
                         for ind in range(1, len(feature))]

            # Drop thresholds equal to the extremes, which would leave
            # one side of the split empty.
            if max(feature) in threshold:
                threshold.remove(max(feature))
            if min(feature) in threshold:
                threshold.remove(min(feature))

            best_IG = 0
            for thre in set(threshold):
                # Use <= on the lower side so samples exactly equal to the
                # threshold are not dropped from both subsets.
                lower = [y[s] for s in range(len(feature)) if feature[s] <= thre]
                higher = [y[s] for s in range(len(feature)) if feature[s] > thre]
                H_l = 0
                for l in set(lower):
                    p = lower.count(l) / len(lower)
                    H_l += -p * math.log(p)
                H_h = 0
                for h in set(higher):
                    p = higher.count(h) / len(higher)
                    H_h += -p * math.log(p)
                # Conditional entropy of y given the split; the gain is the
                # entropy reduction. Keep the best threshold per feature.
                condi_H = len(lower) / n_y * H_l + len(higher) / n_y * H_h
                best_IG = max(best_IG, orig_H - condi_H)
            IG_list.append(best_IG)

        self.IG = IG_list

    def getIG(self):
        return self.IG
```
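To illustrate the threshold step in isolation, here is the midpoint construction on a small hand-made feature vector (the vector is my own example):

```python
# Candidate thresholds for a feature are the midpoints between consecutive
# sorted values; duplicate values can make a midpoint coincide with an actual
# feature value (1.0 below), which is why equal-valued samples must be
# assigned to one side of the split rather than excluded from both.
feature = [0, 1, 1, 2]
s = sorted(feature)
thresholds = [(s[i - 1] + s[i]) / 2 for i in range(1, len(s))]
print(thresholds)  # [0.5, 1.0, 1.5]
```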

Example usage:

```python
if __name__ == "__main__":
    X = [[1, 0, 0, 1],
         [0, 1, 1, 1],
         [0, 0, 1, 0]]
    y = [0, 0, 1]
    print(IG(X, y).getIG())
```
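As a sanity check, the gain of a perfectly separating feature can be verified by hand. In the example above, the last feature takes values [1, 1, 0] against y = [0, 0, 1], so the midpoint threshold 0.5 splits the classes perfectly and its gain should equal the full label entropy (this check is my own, computed independently of the class):

```python
import math

# Label entropy of y = [0, 0, 1]; a perfectly separating feature leaves
# zero conditional entropy, so its information gain equals this value.
y = [0, 0, 1]
n = len(y)
orig_H = -sum((y.count(c) / n) * math.log(y.count(c) / n) for c in set(y))
print(round(orig_H, 4))
```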
