TensorFlow: Eager Execution Basics

Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/Feynman1999/article/details/84666436

Eager execution basics

The official one-sentence explanation:

Eager execution is a feature that makes TensorFlow execute operations immediately: concrete values are returned, instead of creating a computational graph that is executed later.

More concretely:

Eager execution is a flexible machine-learning platform for research and experimentation that provides:

  • An intuitive interface - structure your code naturally and use Python data structures. Iterate quickly on small models and small datasets.
  • Easier debugging - call ops directly to inspect running models and test changes. Use standard Python debugging tools for immediate error reporting.
  • Natural control flow - use Python control flow instead of graph control flow, which simplifies the specification of dynamic models (see the short sketch after this list).
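
As a concrete illustration of the last point, here is a minimal sketch (not from the original post), assuming TensorFlow 1.x with tf.enable_eager_execution(): because every op returns a concrete value immediately, ordinary Python if/while statements can branch directly on tensor contents.

import tensorflow as tf

tf.enable_eager_execution()

# Plain Python control flow driven by tensor values (a Collatz step counter)
def collatz_steps(n):
    n = tf.constant(n)
    steps = 0
    while n > 1:                    # the comparison yields a concrete boolean
        if int(n) % 2 == 0:
            n = n // 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))            # -> 111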

Key points and APIs (see the full code listings below for details)

  • The most obvious differences between NumPy arrays and TensorFlow Tensors are:

    • Tensors can be backed by accelerator memory (like GPU, TPU).
    • Tensors are immutable (see the short sketch after these bullets).
  • Conversion between TensorFlow Tensors and NumPy ndarrays is quite simple:

    • TensorFlow operations automatically convert NumPy ndarrays to Tensors.
    • NumPy operations automatically convert Tensors to NumPy ndarrays.
  • The .numpy() method explicitly converts a Tensor to a NumPy array.
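
To make the immutability point concrete, a minimal sketch (not from the original post), assuming a TF 1.x build where tf.Variable works under eager execution: in-place assignment on a tf.Tensor fails, and tf.Variable is the mutable container to reach for instead.

import tensorflow as tf

tf.enable_eager_execution()

t = tf.constant([1, 2, 3])
try:
    t[0] = 10                   # Tensors cannot be modified in place
except TypeError as e:
    print("Tensors are immutable:", e)

v = tf.Variable([1, 2, 3])      # mutable state lives in Variables
v.assign([4, 5, 6])             # replace the stored value
v.assign_add([1, 1, 1])         # element-wise in-place addition
print(v.numpy())                # -> [5 6 7]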

  • Uniform distribution:

x = tf.random_uniform([3, 3])

  • To check which device's memory a Tensor lives in (for example, whether it is on the first GPU): x.device.endswith('GPU:0')

  • Create a source dataset using one of the factory functions like Dataset.from_tensors, Dataset.from_tensor_slices or using objects that read from files like TextLineDataset or TFRecordDataset. See the TensorFlow Guide for more information. A short sketch contrasting from_tensors and from_tensor_slices follows this list.

    ds_tensors = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6])  # returns a TensorSliceDataset object

  • ds_file = tf.data.TextLineDataset(filename)

  • Apply transformations

    ds_tensors = ds_tensors.map(tf.square).shuffle(2).batch(2)

    ds_file = ds_file.batch(2)

  • Datasets can be iterated over directly; there is no need to explicitly create a tf.data.Iterator object:

print('Elements of ds_tensors:')
for x in ds_tensors:
    print(x,end='\n')

print('\nElements in ds_file:')
for x in ds_file:
    print(x)
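
As promised above, a short sketch (not from the original post) contrasting two of the factory functions: Dataset.from_tensors wraps its entire input as a single element, while Dataset.from_tensor_slices slices the input along its first dimension.

import tensorflow as tf

tf.enable_eager_execution()

data = [[1, 2], [3, 4], [5, 6]]

for x in tf.data.Dataset.from_tensors(data):        # one element of shape (3, 2)
    print(x.shape)

for x in tf.data.Dataset.from_tensor_slices(data):  # three elements of shape (2,)
    print(x.numpy())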

Code1

import tensorflow as tf 
import numpy as np
import timeit

tf.enable_eager_execution()

# Tensor objects have a data type and a shape
# TensorFlow offers a rich library of operations 
# (tf.add, tf.matmul, tf.linalg.inv  etc.) that consume and produce Tensors.

print(tf.add(1,2))
print(tf.add([1,2],[3,4]))
print(tf.square(5))
print(tf.reduce_sum([1,2,3]))
print(tf.encode_base64("hello world"))
print(tf.square(2)+tf.square(3))

x = tf.matmul([[1]], [[2, 3]])
print(x.shape)
print(x.dtype)

#The most obvious differences between NumPy arrays and TensorFlow Tensors are
# 1.Tensors can be backed by accelerator memory (like GPU, TPU).
# 2.Tensors are immutable.

'''
Conversion between TensorFlow Tensors and NumPy ndarrays is quite simple as:
TensorFlow operations automatically convert NumPy ndarrays to Tensors.
NumPy operations automatically convert Tensors to NumPy ndarrays.
'''
ndarray = np.ones((3,3))
print("TensorFlow operations convert numpy arrays to Tensors automatically")
tensor = tf.multiply(ndarray,42)
print(tensor)

print("And NumPy operations convert Tensors to numpy arrays automatically")
print(np.add(tensor, 1))

print("The .numpy() method explicitly converts a Tensor to a numpy array")
print(tensor.numpy())

x = tf.random_uniform([3, 3])
print("Is there a GPU available:", tf.test.is_gpu_available())
print("Is the Tensor on GPU #0:", x.device.endswith('GPU:0'))

test="""
x = tf.random_uniform([1000,1000])
tf.matmul(x,x)
"""

setup="""import tensorflow as tf 
from __main__ import x
"""

#Force execution on CPU
print("On CPU:")
with tf.device("CPU:0"): #指定cpu
    assert x.device.endswith("CPU:0")
    print(timeit.timeit(stmt=test, setup=setup,number=100))
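
# Not in the original post: for symmetry, a sketch of the same benchmark forced
# onto the first GPU, assuming one is available (skipped otherwise)
if tf.test.is_gpu_available():
    print("On GPU:")
    with tf.device("GPU:0"):  # "GPU:1", "GPU:2", ... would address other GPUs
        x = tf.random_uniform([1000, 1000])
        assert x.device.endswith("GPU:0")
        print(timeit.timeit(stmt=test, setup=setup, number=100))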

For usage of the tempfile library, see my other post: https://blog.csdn.net/Feynman1999/article/details/84666275

Code2

import tensorflow as tf
import tempfile
'''
Datasets

 You can use Python iteration over the tf.data.Dataset object 
 and do not need to explicitly create an tf.data.Iterator object.
'''

tf.enable_eager_execution()
ds_tensors = tf.data.Dataset.from_tensor_slices([1,2,3,4,5,6])

# Create a small temporary text file to read back with TextLineDataset

_, filename = tempfile.mkstemp()

with open(filename, 'w') as f:
    f.write("""Line 1
Line 2
Line 3
""")

ds_file = tf.data.TextLineDataset(filename)

# Apply transformations

ds_tensors = ds_tensors.map(tf.square).shuffle(2).batch(2)
ds_file = ds_file.batch(2)

print('Elements of ds_tensors:')
for x in ds_tensors:
    print(x,end='\n')

print('\nElements in ds_file:')
for x in ds_file:
    print(x)
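
Beyond map, shuffle, and batch, tf.data offers further transformations that chain in the same style. A minimal sketch (not from the original post), reusing the imports and eager setup from Code2:

ds = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6])
ds = (ds.filter(lambda t: tf.equal(t % 2, 0))   # keep even values only
        .repeat(2)                              # pass over the data twice
        .batch(2))                              # group into batches of 2

for batch in ds:
    print(batch.numpy())                        # [2 4], [6 2], [4 6]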
