Deep Learning from Beginner to Giving Up, Part 3: Data (saving and reading with queues)

Disclaimer: This is an original article by the blogger, released under the CC 4.0 BY-SA license. When reproducing it, please include the original source link and this statement.
Original link: https://blog.csdn.net/LEEANG121/article/details/102559400


Save data

After a neural network has been trained, its weights and biases need to be saved, so that the next time you use the network you can load the parameters directly instead of training again. Below is a basic program for saving and restoring data. Note that only the variables (weights and biases) are saved, not the graph itself; what is stored are the weights, so the model can be reused cheaply next time:

import tensorflow as tf
Weights = tf.Variable([[1, 2, 3], [3, 4, 5]], dtype=tf.float32, name="Weights")
baises = tf.Variable([[1, 32, 3]], dtype=tf.float32, name="baises")
saver = tf.train.Saver()  # both saving and restoring require creating a Saver object first
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    save_path = saver.save(sess, "C:/Users/52566/.jupyter/saver_1")  # the slashes must not be reversed here, or an error is raised

The returned save_path is the storage path: C:/Users/52566/.jupyter/saver_1


Read data

TensorFlow supports several ways of reading data. The simplest is feeding: constants or input data are passed in through placeholders (a placeholder reserves a spot in the graph; I will explain the function in another note). However, this approach requires you to manually convert the data into arrays of the right type and feed them in at every step of the iteration.

This note describes reading data in the form of a queue. This method saves a lot of redundant work: you only deal with the queue itself, not with the underlying file formats and data types. Much of the time otherwise spent on data handling and preprocessing can thus be avoided.

The FIFOQueue function creates a first-in, first-out queue. It is mainly used when the neural network model needs its input in a fixed order, such as in time-series analysis. Another way to create a queue is the RandomShuffleQueue function, which reads and outputs samples in random order.


To sum up

The queue-based data reading pipeline works as follows:
1. Read the names and paths of the data files from disk;
2. Push the file names onto the tail of a filename queue, in order;
3. Take a file name from the head of the queue and read the file's data;
4. Decode the raw data with a decoder;
5. Feed the decoded samples into an example queue for subsequent processing.
