TensorFlow 2.0 Tutorial - Automatic Differentiation

Original article: https://doit-space.blog.csdn.net/article/details/95040901

The most complete TensorFlow 2.0 getting-started tutorial, continuously updated: https://blog.csdn.net/qq_31456593/article/details/88606284

The complete TensorFlow 2.0 tutorial code is available at https://github.com/czy36mengfei/tensorflow2_tutorials_chinese (stars welcome).

This tutorial is compiled from the author's notes on reproducing the official TensorFlow 2.0 tutorials, explained in Chinese so that Chinese-speaking readers can follow along more easily. Official tutorials: https://www.tensorflow.org

This section introduces how to perform automatic differentiation with TensorFlow 2.

1. Gradient tapes

TensorFlow provides the tf.GradientTape API for automatic differentiation. Any operation executed inside a tf.GradientTape() context is recorded onto a "tape"; TensorFlow then uses reverse-mode automatic differentiation to compute gradients of the recorded operations.

import tensorflow as tf

x = tf.ones((2, 2))

# Record the operations whose gradient we want to compute
with tf.GradientTape() as t:
    t.watch(x)
    y = tf.reduce_sum(x)
    z = tf.multiply(y, y)
# Compute the gradient of z with respect to x
dz_dx = t.gradient(z, x)
print(dz_dx)
tf.Tensor(
[[8. 8.]
 [8. 8.]], shape=(2, 2), dtype=float32)
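
Note that the explicit t.watch(x) call is needed here only because x is a constant tensor; a tape automatically watches trainable tf.Variable objects. A minimal sketch of the same computation using a variable instead:

v = tf.Variable(tf.ones((2, 2)))

with tf.GradientTape() as t:
    # No t.watch(v) needed: trainable variables are watched automatically
    y = tf.reduce_sum(v)
    z = tf.multiply(y, y)

# Same gradient as before: a (2, 2) tensor filled with 8.0
print(t.gradient(z, v))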

The derivative with respect to an intermediate variable can also be computed:

# By default, gradients can be requested only once per tape
with tf.GradientTape() as t:
    t.watch(x)
    y = tf.reduce_sum(x)
    z = tf.multiply(y, y)

dz_dy = t.gradient(z, y)
print(dz_dy)
tf.Tensor(8.0, shape=(), dtype=float32)
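
Because the tape's resources are released after the first gradient() call, requesting a second gradient from the same non-persistent tape raises a RuntimeError. A small sketch (our own addition, not from the original tutorial) illustrating this:

with tf.GradientTape() as t:
    t.watch(x)
    y = tf.reduce_sum(x)
    z = tf.multiply(y, y)

dz_dy = t.gradient(z, y)      # first call: fine
try:
    dz_dx = t.gradient(z, x)  # second call on a non-persistent tape
except RuntimeError as e:
    print("RuntimeError:", e)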

By default, the resources held by a GradientTape are released as soon as its gradient() method is called. To compute multiple gradients over the same computation, create a persistent gradient tape.

with tf.GradientTape(persistent=True) as t:
    t.watch(x)
    y = tf.reduce_sum(x)
    z = tf.multiply(y, y)
    
dz_dx = t.gradient(z, x)
print(dz_dx)
dz_dy = t.gradient(z, y)
print(dz_dy)
tf.Tensor(
[[8. 8.]
 [8. 8.]], shape=(2, 2), dtype=float32)
tf.Tensor(8.0, shape=(), dtype=float32)
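
With a persistent tape, the resources are held until you drop the reference to the tape object yourself, so delete it once you are done:

del t  # drop the reference to release the tape's resources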

2. Recording control flow

Because the tape records operations as they are executed, Python control flow (such as if and while statements) is handled naturally, and gradients are computed correctly even when the recorded operations depend on it.

def f(x, y):
    output = 1.0
    # Loop y times
    for i in range(y):
        # Multiply by x only for selected iterations
        if i > 1 and i < 5:
            output = tf.multiply(output, x)
    return output

def grad(x, y):
    with tf.GradientTape() as t:
        t.watch(x)
        out = f(x, y)
        # Return the gradient of the output with respect to x
        return t.gradient(out, x)

# x is a fixed value
x = tf.convert_to_tensor(2.0)

print(grad(x, 6))
print(grad(x, 5))
print(grad(x, 4))
tf.Tensor(12.0, shape=(), dtype=float32)
tf.Tensor(12.0, shape=(), dtype=float32)
tf.Tensor(4.0, shape=(), dtype=float32)
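
As a further sketch (our own addition, not from the original tutorial; the function name is hypothetical), the same mechanism handles a Python while loop whose trip count depends on the data:

def repeated_multiply_until(x, limit):
    # Keep multiplying by x until the running product reaches `limit`
    result = x
    while result < limit:
        result = result * x
    return result

x = tf.constant(2.0)
with tf.GradientTape() as t:
    t.watch(x)
    out = repeated_multiply_until(x, 10.0)  # 2 -> 4 -> 8 -> 16, i.e. x**4

print(t.gradient(out, x))  # 4 * x**3 = 32.0 at x = 2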

3. Higher-order gradients

Operations performed inside a GradientTape context manager are recorded even when they are themselves gradient computations, so nesting tapes makes it possible to compute higher-order gradients.

x = tf.Variable(1.0)

with tf.GradientTape() as t1:
    with tf.GradientTape() as t2:
        y = x * x * x
    # First derivative: dy/dx = 3 * x**2
    dy_dx = t2.gradient(y, x)
    print(dy_dx)
# Second derivative: d2y/dx2 = 6 * x
d2y_d2x = t1.gradient(dy_dx, x)
print(d2y_d2x)
tf.Tensor(3.0, shape=(), dtype=float32)
tf.Tensor(6.0, shape=(), dtype=float32)
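
The same nesting pattern works for any differentiable function. A quick sketch (our own, not part of the original tutorial) using y = sin(x), whose second derivative is -sin(x):

import numpy as np

x = tf.Variable(np.pi / 2.0)

with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        y = tf.sin(x)
    dy_dx = inner.gradient(y, x)    # cos(pi/2), approximately 0.0
d2y_dx2 = outer.gradient(dy_dx, x)  # -sin(pi/2), approximately -1.0
print(dy_dx, d2y_dx2)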