TensorFlow Study Notes (3): A First Look at the MNIST Handwritten Digit Dataset


Source: *TensorFlow实战Google深度学习框架* (TensorFlow: Practical Google Deep Learning Framework), 2nd edition, by Zheng Zeyu, Liang Bowen, and Gu Siyu. Publishing House of Electronics Industry.


I. Concepts

    The MNIST handwritten digit dataset is a famous introductory sample dataset. It contains many images, each depicting one of the ten digits 0-9, with the digit centered in the image. Each image is a 28×28 matrix whose elements are decimals between 0 and 1: 0 means pure white, 1 means pure black, and values in between get darker as they approach 1. Although each image is a 28×28 matrix, MNIST actually stores it as a one-dimensional array of length 28×28 = 784, because a flat array is easier to process.
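To make the relationship between the 28×28 matrix and the length-784 vector concrete, here is a small NumPy sketch (the `image` array here is made up for illustration, not a real MNIST sample):

```python
import numpy as np

# A hypothetical 28x28 grayscale image with pixel values in [0, 1]
image = np.random.rand(28, 28)

# MNIST stores each image as a flat vector of length 28 * 28 = 784,
# because a 1-D array is easier to feed into a model
flat = image.reshape(784)
print(flat.shape)  # (784,)

# Reshaping back recovers the original 2-D matrix, e.g. for display
recovered = flat.reshape(28, 28)
print(np.array_equal(image, recovered))  # True
```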

    MNIST consists of three subsets of images:

     training data: 55,000 images

     validation data: 5,000 images

     testing data: 10,000 images

The input_data.read_data_sets() function automatically splits the MNIST data into these three sets.

To make stochastic gradient descent convenient, the class returned by input_data.read_data_sets() also provides mnist.train.next_batch(), which reads a small portion of the training data as one training batch.
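As a rough illustration of what next_batch() does, here is a NumPy sketch that samples a random batch from toy stand-in arrays. (The arrays below are zeros, only their shapes match MNIST; the real next_batch walks through a shuffled copy of the data epoch by epoch rather than sampling independently on each call, so this is only an approximation.)

```python
import numpy as np

# Toy stand-ins for mnist.train.images / mnist.train.labels:
# 55000 flattened images of length 784, and 55000 one-hot labels of length 10
num_examples = 55000
images = np.zeros((num_examples, 784))
labels = np.zeros((num_examples, 10))

def next_batch(batch_size):
    """Approximate mnist.train.next_batch: return a random subset of the data."""
    idx = np.random.choice(num_examples, size=batch_size, replace=False)
    return images[idx], labels[idx]

xs, ys = next_batch(100)
print(xs.shape, ys.shape)  # (100, 784) (100, 10)
```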

II. The Code


from tensorflow.examples.tutorials.mnist import input_data

# Load the MNIST dataset. If no downloaded copy exists under the given path
# ./path/to/MNIST_data/, TensorFlow downloads it automatically from a fixed URL.
mnist = input_data.read_data_sets("./path/to/MNIST_data/", one_hot=True)

# Prints "Training data size: 55000"
print("Training data size:", mnist.train.num_examples)

# Prints "Validating data size: 5000"
print("Validating data size:", mnist.validation.num_examples)

# Prints "Testing data size: 10000"
print("Testing data size:", mnist.test.num_examples)

# Prints "Example training data: [0. 0. 0. ... 0.380 0.376 ... 0.]"
print("Example training data:", mnist.train.images[0])

# Prints "Example training data label:
# [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]"
print("Example training data label:", mnist.train.labels[0])

batch_size = 100
# Draw batch_size examples from the training set as one batch;
# xs holds the images, ys holds the matching one-hot labels
xs, ys = mnist.train.next_batch(batch_size)
# Prints "X shape: (100, 784)"
print("X shape:", xs.shape)
# Prints "Y shape: (100, 10)"
print("Y shape:", ys.shape)

III. Output of the Code

runfile('/Users/mac126/MNIST_TEST.py', wdir='/Users/mac126')
Extracting ./path/to/MNIST_data/train-images-idx3-ubyte.gz
Extracting ./path/to/MNIST_data/train-labels-idx1-ubyte.gz
Extracting ./path/to/MNIST_data/t10k-images-idx3-ubyte.gz
Extracting ./path/to/MNIST_data/t10k-labels-idx1-ubyte.gz
Training data size: 55000
Validating data size: 5000
Testing data size: 10000
Example training data: [0.         0.         0.         ...
 0.38039219 0.37647063 0.3019608  0.46274513 0.2392157  ...
 0.99607849 0.99607849 ... 0.         0.         0.        ]
(the full printout lists all 784 pixel values; it is abridged here — most entries are 0., with nonzero intensities along the digit's strokes)
Example training data label: [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
X shape: (100, 784)
Y shape: (100, 10)

IV. Code Analysis

1. If an error occurs

    Running the code exactly as printed in *TensorFlow实战Google深度学习框架* raised an error for me (the error screenshot is not reproduced here).

So I changed the book's

mnist = input_data.read_data_sets("/path/to/MNIST_data/", one_hot=True)

to:

  mnist = input_data.read_data_sets("./path/to/MNIST_data/", one_hot=True)

or  mnist = input_data.read_data_sets("", one_hot=True) — either works.

 As long as there is no downloaded dataset under the specified path, TensorFlow automatically downloads the dataset from a fixed URL.

2. images and labels

mnist.train.images[0] is the image at index 0 of the training set.

mnist.train.labels[0] is the label of that image, i.e. the digit it represents.
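Because the data was loaded with one_hot=True, each label is a length-10 vector with a single 1, and the digit itself can be recovered with np.argmax. A small sketch, using the example label from the output above:

```python
import numpy as np

# The one-hot label printed in the output above:
# a length-10 vector with a single 1 at index 7
label = np.array([0., 0., 0., 0., 0., 0., 0., 1., 0., 0.])

# np.argmax returns the index of the 1, i.e. the digit the image shows
digit = int(np.argmax(label))
print(digit)  # 7
```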

3. The meaning of the shapes of xs and ys

Why is xs.shape (100, 784) and ys.shape (100, 10)?

Because 100 is the batch_size, 784 = 28×28 is the length of each flattened image vector, and 10 is the length of each one-hot label (one slot per digit, 0-9).

In other words: we take just 100 samples (100 images) from the MNIST training set, and each image comes paired with its answer — the digit (0-9) that the image shows.
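A quick sanity check on these numbers: with 55,000 training images and a batch size of 100, one full pass over the training set (one epoch) takes 550 batches:

```python
# One training step consumes one batch of 100 images, so a full pass
# over the 55000 training images takes 55000 / 100 = 550 steps
num_train = 55000
batch_size = 100
steps_per_epoch = num_train // batch_size
print(steps_per_epoch)  # 550
```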


Reposted from blog.csdn.net/Johnisjohn/article/details/88619604