MNIST handwritten digit recognition with an MLP in Keras: a walkthrough

In [1]:
import numpy as np
import pandas as pd

from keras.utils import np_utils
np.random.seed(10)
 
Using TensorFlow backend.
In [2]:
from keras.datasets import mnist
In [3]:
(x_train_image,y_train_label),\
(x_test_image,y_test_label) = mnist.load_data()
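
A quick shape check (a hypothetical cell, not part of the original run) confirms what mnist.load_data() returns: 60,000 training and 10,000 test images, each 28x28 pixels, with one integer label per image.

print('train images:', x_train_image.shape)  # (60000, 28, 28)
print('train labels:', y_train_label.shape)  # (60000,)
print('test images :', x_test_image.shape)   # (10000, 28, 28)
print('test labels :', y_test_label.shape)   # (10000,)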
In [4]:
import matplotlib.pyplot as plt
def plot_image(image):
    fig = plt.gcf()
    fig.set_size_inches(1,1)
    plt.imshow(image,cmap='binary')
    plt.show()
In [5]:
plot_image(x_train_image[0])
y_train_label[0]
 
Out[5]:
5
In [6]:
def plot_image_labels_prediction(images, labels, prediction, idx, num=10):
    fig = plt.gcf()
    fig.set_size_inches(12, 24)
    if num > 50: num = 50                  # the 10x5 grid below holds at most 50 images
    for i in range(0, num):
        ax = plt.subplot(10, 5, 1 + i)
        ax.imshow(images[idx], cmap='binary')
        title = "label=" + str(labels[idx])
        if len(prediction) > 0:            # only show predictions when provided
            title += ",predict=" + str(prediction[idx])
        ax.set_title(title, fontsize=10)
        ax.set_xticks([]); ax.set_yticks([])
        idx += 1
    plt.show()
In [7]:
plot_image_labels_prediction(x_train_image,y_train_label,[],0,10)
 
In [8]:
x_train = x_train_image.reshape(60000, 784).astype('float32')  # flatten 28x28 -> 784
x_test = x_test_image.reshape(10000, 784).astype('float32')
In [9]:
x_train_normalize = x_train/255   # rescale pixel values from 0-255 to 0-1
x_test_normalize = x_test/255
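
As a sanity check (hypothetical cell), dividing by 255 maps the pixel intensities into the 0-1 range, which helps gradient-based training:

print(x_train.min(), x_train.max())                      # 0.0 255.0
print(x_train_normalize.min(), x_train_normalize.max())  # 0.0 1.0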
In [10]:
y_train_oneHot = np_utils.to_categorical(y_train_label)
y_test_oneHot = np_utils.to_categorical(y_test_label)
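
One-hot encoding turns each integer label into a 10-dimensional indicator vector; for example, the first training label, 5, becomes a vector with a 1 at index 5 (hypothetical check):

print(y_train_label[0])   # 5
print(y_train_oneHot[0])  # [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]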
In [11]:
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
In [12]:
model = Sequential()
In [13]:
model.add(Dense(units=1000,                   # hidden layer: 1000 units
                input_dim=784,                # 784 = 28x28 flattened pixels
                kernel_initializer='normal',
                activation='relu'))
In [14]:
model.add(Dropout(0.5))   # randomly drop 50% of hidden units during training to reduce overfitting
In [15]:
model.add(Dense(units=10,                     # output layer: one unit per digit class
                kernel_initializer='normal',
                activation='sigmoid'))        # note: 'softmax' is the conventional choice
                                              # for multi-class output with categorical_crossentropy
In [16]:
print(model.summary())
 
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 1000)              785000    
_________________________________________________________________
dropout_1 (Dropout)          (None, 1000)              0         
_________________________________________________________________
dense_2 (Dense)              (None, 10)                10010     
=================================================================
Total params: 795,010
Trainable params: 795,010
Non-trainable params: 0
_________________________________________________________________
None
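
The parameter counts in the summary follow directly from the layer sizes: a Dense layer has (inputs x units) weights plus one bias per unit. A quick check (hypothetical cell):

print(784 * 1000 + 1000)  # 785000 params in the hidden layer
print(1000 * 10 + 10)     # 10010 params in the output layer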
In [17]:
model.compile(loss='categorical_crossentropy',
             optimizer='adam',metrics=['accuracy'])
In [18]:
train_history = model.fit(x=x_train_normalize,
                         y=y_train_oneHot,
                         validation_split = 0.2,
                         epochs = 10,
                         batch_size = 200,
                         verbose = 2)
 
Train on 48000 samples, validate on 12000 samples
Epoch 1/10
 - 6s - loss: 0.4048 - acc: 0.8816 - val_loss: 0.1653 - val_acc: 0.9549
Epoch 2/10
 - 6s - loss: 0.1715 - acc: 0.9490 - val_loss: 0.1204 - val_acc: 0.9645
Epoch 3/10
 - 5s - loss: 0.1246 - acc: 0.9624 - val_loss: 0.1008 - val_acc: 0.9704
Epoch 4/10
 - 5s - loss: 0.0993 - acc: 0.9701 - val_loss: 0.0922 - val_acc: 0.9723
Epoch 5/10
 - 6s - loss: 0.0820 - acc: 0.9755 - val_loss: 0.0843 - val_acc: 0.9753
Epoch 6/10
 - 6s - loss: 0.0680 - acc: 0.9786 - val_loss: 0.0787 - val_acc: 0.9766
Epoch 7/10
 - 6s - loss: 0.0598 - acc: 0.9818 - val_loss: 0.0778 - val_acc: 0.9772
Epoch 8/10
 - 5s - loss: 0.0545 - acc: 0.9829 - val_loss: 0.0710 - val_acc: 0.9790
Epoch 9/10
 - 5s - loss: 0.0480 - acc: 0.9851 - val_loss: 0.0708 - val_acc: 0.9791
Epoch 10/10
 - 6s - loss: 0.0410 - acc: 0.9874 - val_loss: 0.0680 - val_acc: 0.9796
In [19]:
def show_train_history(train_history,train,validation):
    plt.plot(train_history.history[train])
    plt.plot(train_history.history[validation])
    plt.title('Train_History')
    plt.ylabel(train)
    plt.xlabel('Epoch')
    plt.legend(['train','validation'], loc = 'upper left')
    plt.show()
In [20]:
show_train_history(train_history,'acc','val_acc')
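
The same helper plots the loss curves, which is worth doing alongside accuracy to watch for overfitting (hypothetical cell):

show_train_history(train_history, 'loss', 'val_loss')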
 
In [21]:
score = model.evaluate(x_test_normalize, y_test_oneHot)
print()
print('accuracy=', score[1])
 
10000/10000 [==============================] - 1s 72us/step

accuracy= 0.9806
In [22]:
prediction = model.predict_classes(x_test_normalize)
In [23]:
prediction
Out[23]:
array([7, 2, 1, ..., 4, 5, 6], dtype=int64)
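
Comparing the predicted classes against the true labels locates the misclassified test images; given the 0.9806 test accuracy there should be roughly 194 of them (hypothetical cell):

errors = np.nonzero(prediction != y_test_label)[0]
print('misclassified:', len(errors))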
In [24]:
plot_image_labels_prediction(x_test_image,
                             y_test_label,
                             prediction,
                             0,
                            50)
 
In [25]:
pd.crosstab(y_test_label,
            prediction,
           rownames=['label'],
           colnames=['predict'])
Out[25]:
 
predict     0     1     2    3    4    5    6     7    8    9
label
0         973     0     1    1    0    0    1     1    3    0
1           0  1128     3    0    0    0    1     0    3    0
2           3     1  1013    1    3    0    2     5    4    0
3           2     0     1  992    0    3    0     6    3    3
4           0     0     4    0  967    0    2     1    2    6
5           3     0     0   11    1  863    6     1    6    1
6           7     2     0    1    6    4  933     0    5    0
7           1     3     8    1    1    0    0  1006    2    6
8           4     0     1    3    5    2    1     2  953    3
9           3     3     0    2   14    2    0     4    3  978
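
The off-diagonal cells identify specific confusions; the largest here are 14 nines predicted as fours and 11 fives predicted as threes. Those samples can be pulled out with a simple DataFrame filter (hypothetical cell):

df = pd.DataFrame({'label': y_test_label, 'predict': prediction})
print(df[(df.label == 5) & (df.predict == 3)])  # the 11 fives misread as threes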
MLP: Multi-Layer Perceptron

Hmm, there doesn't seem to be much more to say; leaving it here for now and coming back to it later.



Reposted from www.cnblogs.com/bai2018/p/10355557.html