Notes on the TensorFlow Dataset API and TFRecord [continuously updated]

For a very good article on using the TensorFlow Dataset API together with TFRecord files, see:

https://cloud.tencent.com/developer/article/1088751

The accompanying GitHub notebook:

https://github.com/YJango/TFRecord-Dataset-Estimator-API/blob/master/TensorFlow%20Dataset%20%2B%20TFRecords.ipynb


Notes on the Dataset API:

Usually the order is: shuffle first, then batch, then repeat.

Of course, you can also repeat first and then batch. The difference is that this removes the partial last batch of each epoch: batches are then cut across epoch boundaries, so a single batch can mix samples from two adjacent epochs. Use this ordering with caution on a test set.
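The ordering difference can be shown with a minimal sketch (my own illustration, not from the original article; it uses tf.compat.v1 graph mode so it also runs under TF 2.x, and omits shuffle so the output is deterministic):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode

data = list(range(5))  # 5 samples, batch size 2 -> the last batch of an epoch is partial

# batch first, then repeat: each epoch keeps its own (smaller) last batch
ds_a = tf.data.Dataset.from_tensor_slices(data).batch(2).repeat(2)
# repeat first, then batch: batches are cut across the epoch boundary,
# so one batch can mix samples from two epochs
ds_b = tf.data.Dataset.from_tensor_slices(data).repeat(2).batch(2)

def collect(ds):
    """Run a dataset to exhaustion and return its batches as plain lists."""
    iterator = tf.compat.v1.data.make_one_shot_iterator(ds)
    next_batch = iterator.get_next()
    batches = []
    with tf.compat.v1.Session() as sess:
        while True:
            try:
                batches.append(sess.run(next_batch).tolist())
            except tf.errors.OutOfRangeError:
                return batches

print(collect(ds_a))  # [[0, 1], [2, 3], [4], [0, 1], [2, 3], [4]]
print(collect(ds_b))  # [[0, 1], [2, 3], [4, 0], [1, 2], [3, 4]]
```

With shuffle added in front, the same boundary behavior holds, which is why repeat-before-batch is risky when evaluating on a test set.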


The Dataset API's make_one_shot_iterator and make_initializable_iterator

See this question on Stack Overflow:

https://stackoverflow.com/questions/48091693/tensorflow-dataset-api-diff-between-make-initializable-iterator-and-make-one-sho

Actually, I think the key point is not the question itself but the answer: a one-shot iterator can only iterate through the data for a single epoch, while an initializable iterator can iterate for multiple epochs (by calling sess.run(iterator.initializer) again before each one). So the difference between the two iterators is quite clear. In other words, the problem with the second code snippet in that question is that, except for epoch 0, every remaining epoch immediately hits an out-of-range (no data) error.

You can verify this with the following code:

import tensorflow as tf
import numpy as np

# 5 samples of 2 features each, shuffled and batched in pairs
dataset = tf.data.Dataset.from_tensor_slices(np.random.uniform(size=(5, 2))).shuffle(100).batch(2)
iterator = dataset.make_initializable_iterator()
# iterator = dataset.make_one_shot_iterator()  # swap in to see the one-shot behavior
one_element = iterator.get_next()
with tf.Session() as sess:
    for i in range(5):  # 5 epochs
        sess.run(iterator.initializer)  # re-initialize at the start of each epoch
        while True:
            try:
                print(sess.run(one_element))
            except tf.errors.OutOfRangeError:
                print("Epoch %s is done." % i)
                break
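The contrast can also be sketched side by side (again my own illustration, written with tf.compat.v1 so it runs under TF 2.x as well): once exhausted, a one-shot iterator keeps raising OutOfRangeError, while an initializable iterator can simply be restarted.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode

dataset = tf.data.Dataset.from_tensor_slices([10, 20, 30])
one_shot = tf.compat.v1.data.make_one_shot_iterator(dataset)
init_iter = tf.compat.v1.data.make_initializable_iterator(dataset)
one_shot_next = one_shot.get_next()
init_next = init_iter.get_next()

with tf.compat.v1.Session() as sess:
    # epoch 0: both iterators yield all three elements
    for _ in range(3):
        sess.run(one_shot_next)
    sess.run(init_iter.initializer)
    for _ in range(3):
        sess.run(init_next)

    # epoch 1: the one-shot iterator is exhausted for good...
    try:
        sess.run(one_shot_next)
    except tf.errors.OutOfRangeError:
        print("one-shot: OutOfRangeError")

    # ...but the initializable iterator just needs re-initialization
    sess.run(init_iter.initializer)
    print("initializable restarted at:", sess.run(init_next))  # 10
```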

Origin www.cnblogs.com/zhouxiaosong/p/12080760.html