Getting Started with Deep Learning in Keras

Goal: classify the Iris dataset.

Each Iris sample has 4 features, so the first layer takes 4 inputs. We define two hidden layers with 10 neurons each. Since Iris has 3 classes, the final layer has 3 neurons with a softmax activation, which normalizes the outputs into a probability distribution over [0, 1].

Key functions:

keras.utils.to_categorical() converts integer labels into one-hot vectors. For example, with data_label = [0, 1, 4], the function infers the one-hot dimension from the largest label (max + 1 = 5), so the labels above become:

[[ 1.  0.  0.  0.  0.]
 [ 0.  1.  0.  0.  0.]
 [ 0.  0.  0.  0.  1.]]
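The same conversion can be sketched in plain numpy (to_one_hot here is a hypothetical helper written for illustration, not part of Keras):

```python
import numpy as np

def to_one_hot(labels):
    # Dimension is inferred from the largest label (max + 1),
    # mirroring the behavior of keras.utils.to_categorical
    labels = np.asarray(labels)
    num_classes = labels.max() + 1
    one_hot = np.zeros((labels.size, num_classes))
    one_hot[np.arange(labels.size), labels] = 1.0
    return one_hot

print(to_one_hot([0, 1, 4]))
```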

In fit(), the training inputs and labels map one-to-one onto the network's input and output neurons, so for this problem the input has shape (None, 4) and the output has shape (None, 3), with each input row paired with exactly one label row. The batch_size argument in fit() sets the size of each randomly sampled mini-batch; mini-batches can be computed in parallel on a GPU, and the sampling noise they introduce often helps the optimizer converge to solutions that generalize better.
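The mini-batch sampling that fit() performs internally can be sketched as follows (the arrays here are random stand-ins for the Iris data, with the same (None, 4) and (None, 3) shapes):

```python
import numpy as np

# Hypothetical stand-ins for the Iris arrays: inputs (150, 4), one-hot labels (150, 3)
X = np.random.rand(150, 4)
Y = np.eye(3)[np.random.randint(0, 3, 150)]

batch_size = 20
idx = np.random.permutation(len(X))        # shuffle once per epoch
batches = []
for start in range(0, len(X), batch_size):
    sel = idx[start:start + batch_size]
    batches.append((X[sel], Y[sel]))       # rows stay paired input-to-label
```

With 150 samples and batch_size=20, each epoch yields 8 batches, the last one holding the remaining 10 samples.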

Code:

# -*- coding: utf-8 -*-
"""
Created on Fri Sep 15 20:00:39 2017

@author: wjw
"""
import keras
import numpy as np
from keras.layers import Dense, Activation

def readText(filePath):
    # Read the raw Iris file: 4 float features plus a class name per line
    with open(filePath, 'r') as f:
        lines = f.readlines()
    data = []
    dataClass = []
    for line in lines:
        dataList = line.strip().split(',')
        data.append([float(x) for x in dataList[:4]])
        dataClass.append(dataList[4])
   
    new_class = []
    
    for name in dataClass:
        if name=="Iris-setosa":
            new_class.append(0)
        elif name=="Iris-versicolor":
            new_class.append(1)
        else:
            new_class.append(2)
            
    return np.array(data),np.array(new_class)

model = keras.models.Sequential()  # initialize a sequential network
model.add(Dense(10, input_dim=4))  # Dense is a fully connected layer
model.add(Activation("sigmoid"))
model.add(Dense(10))
model.add(Activation("sigmoid"))
model.add(Dense(3))
model.add(Activation("softmax"))
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
# "categorical_crossentropy" measures the loss as the cross-entropy
# between the softmax outputs and the one-hot labels; the optimizer is "adam"
filePath = r"E:\data\iris.txt"
traindata, dataClass = readText(filePath)
dataClass = keras.utils.to_categorical(dataClass)
print(dataClass)

model.fit(traindata, dataClass, batch_size=20, epochs=200)
score = model.evaluate(traindata, dataClass, batch_size=20)
print(score)
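After training, class predictions come from taking the argmax of each softmax row; a small numpy sketch with made-up probabilities (not actual model output) shows how accuracy is computed:

```python
import numpy as np

# Hypothetical softmax outputs for 3 samples (each row sums to 1)
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.7,  0.2 ],
                  [0.2, 0.3,  0.5 ]])
predicted = probs.argmax(axis=1)             # most probable class per row
true_labels = np.array([0, 1, 2])
accuracy = (predicted == true_labels).mean() # fraction of correct predictions
```

This is the same comparison that the "accuracy" metric in model.evaluate() performs against the one-hot labels.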

Partial training output:

Epoch 1/200
150/150 [==============================] - 0s - loss: 1.2763 - acc: 0.3333     
Epoch 2/200
150/150 [==============================] - 0s - loss: 1.2426 - acc: 0.3333     
Epoch 3/200
150/150 [==============================] - 0s - loss: 1.2117 - acc: 0.3333     
Epoch 4/200
150/150 [==============================] - 0s - loss: 1.1872 - acc: 0.3333     
Epoch 5/200
150/150 [==============================] - 0s - loss: 1.1667 - acc: 0.3333     
Epoch 6/200
150/150 [==============================] - 0s - loss: 1.1497 - acc: 0.3333     
Epoch 7/200
150/150 [==============================] - 0s - loss: 1.1342 - acc: 0.3333     
Epoch 8/200
150/150 [==============================] - 0s - loss: 1.1214 - acc: 0.3333     
Epoch 9/200
150/150 [==============================] - 0s - loss: 1.1107 - acc: 0.3333     
Epoch 10/200
150/150 [==============================] - 0s - loss: 1.1015 - acc: 0.3333     
Epoch 11/200
150/150 [==============================] - 0s - loss: 1.0942 - acc: 0.3333     
Epoch 12/200
150/150 [==============================] - 0s - loss: 1.0873 - acc: 0.2467     
Epoch 13/200
150/150 [==============================] - 0s - loss: 1.0816 - acc: 0.1867     
Epoch 14/200
150/150 [==============================] - 0s - loss: 1.0747 - acc: 0.3267     
Epoch 15/200
150/150 [==============================] - 0s - loss: 1.0693 - acc: 0.3200     
Epoch 16/200
150/150 [==============================] - 0s - loss: 1.0637 - acc: 0.3933     
Epoch 17/200
150/150 [==============================] - 0s - loss: 1.0583 - acc: 0.3933     
Epoch 18/200
150/150 [==============================] - 0s - loss: 1.0524 - acc: 0.4733     
Epoch 19/200
150/150 [==============================] - 0s - loss: 1.0461 - acc: 0.6533     
Epoch 20/200
150/150 [==============================] - 0s - loss: 1.0393 - acc: 0.8000     
Epoch 21/200
150/150 [==============================] - 0s - loss: 1.0318 - acc: 0.7333     
Epoch 22/200
150/150 [==============================] - 0s - loss: 1.0248 - acc: 0.7200     
Epoch 23/200
150/150 [==============================] - 0s - loss: 1.0165 - acc: 0.7133     
Epoch 24/200
150/150 [==============================] - 0s - loss: 1.0087 - acc: 0.7067     
Epoch 25/200
150/150 [==============================] - 0s - loss: 1.0005 - acc: 0.6933     
Epoch 26/200
150/150 [==============================] - 0s - loss: 0.9926 - acc: 0.7067     
Epoch 27/200
150/150 [==============================] - 0s - loss: 0.9839 - acc: 0.7467     
Epoch 28/200
150/150 [==============================] - 0s - loss: 0.9752 - acc: 0.7467     
Epoch 29/200
150/150 [==============================] - 0s - loss: 0.9666 - acc: 0.7333     
Epoch 30/200
150/150 [==============================] - 0s - loss: 0.9581 - acc: 0.7400     
Epoch 31/200
150/150 [==============================] - 0s - loss: 0.9488 - acc: 0.7267     
Epoch 32/200
150/150 [==============================] - 0s - loss: 0.9398 - acc: 0.7200     
Epoch 33/200
150/150 [==============================] - 0s - loss: 0.9305 - acc: 0.7467     




Reposted from blog.csdn.net/ge_nious/article/details/78005736