Wu Yuxiong -- TensorFlow2 Tutorial: Activation Functions and Their Gradients

import tensorflow as tf

# Sigmoid and its gradient.
# a comes from tf.linspace, so it is a plain tensor, not a tf.Variable;
# the tape must watch it explicitly or gradient() returns None.
a = tf.linspace(-10., 10., 10)
a
with tf.GradientTape() as tape:
    tape.watch(a)
    y = tf.sigmoid(a)
grads = tape.gradient(y, [a])
grads
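The tape result above can be cross-checked against the closed-form derivative sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A minimal sketch (variable names are illustrative):

```python
import tensorflow as tf

x = tf.linspace(-10., 10., 10)
with tf.GradientTape() as tape:
    tape.watch(x)              # x is a plain tensor, so watch it explicitly
    y = tf.sigmoid(x)
[auto_grad] = tape.gradient(y, [x])

# Closed-form derivative of the sigmoid
s = tf.sigmoid(x)
analytic_grad = s * (1. - s)

max_diff = float(tf.reduce_max(tf.abs(auto_grad - analytic_grad)))
```

The two agree to floating-point precision, and the gradient never exceeds 0.25 (its value at x = 0), which is why deep sigmoid stacks tend to suffer from vanishing gradients.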

# Tanh squashes its input into (-1, 1) and saturates at both ends.
a = tf.linspace(-5., 5., 10)
a
tf.tanh(a)
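The cell above only evaluates tanh; its gradient, tanh'(x) = 1 - tanh(x)^2, can be obtained with the same tape pattern. A sketch (names are illustrative):

```python
import tensorflow as tf

x = tf.linspace(-5., 5., 10)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = tf.tanh(x)
[auto_grad] = tape.gradient(y, [x])

# Closed-form derivative: 1 - tanh(x)^2
analytic_grad = 1. - tf.tanh(x) ** 2

max_diff = float(tf.reduce_max(tf.abs(auto_grad - analytic_grad)))
```

At the saturated ends (x = ±5) the gradient is already tiny, the same vanishing-gradient behavior seen with the sigmoid.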

# ReLU zeroes out negative inputs; leaky ReLU keeps a small negative slope
# (tf.nn.leaky_relu uses alpha=0.2 by default).
a = tf.linspace(-1., 1., 10)
a
tf.nn.relu(a)
tf.nn.leaky_relu(a)
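The gradients of both variants are piecewise constant: 0 (ReLU) or alpha (leaky ReLU, default alpha=0.2) for negative inputs, and 1 for positive inputs. A sketch using a persistent tape so both gradients can be taken from one recording (names are illustrative):

```python
import tensorflow as tf

# 10 points in [-1, 1]: 5 negative, 5 positive, no exact zero
x = tf.linspace(-1., 1., 10)
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    y_relu = tf.nn.relu(x)
    y_leaky = tf.nn.leaky_relu(x)   # alpha=0.2 by default
grad_relu = tape.gradient(y_relu, x)
grad_leaky = tape.gradient(y_leaky, x)
del tape  # release the persistent tape's resources
```

Because the positive-side gradient is exactly 1, ReLU-family activations avoid the vanishing gradients of sigmoid and tanh; the leaky variant additionally keeps a nonzero gradient for negative inputs so those units can still learn.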


Reposted from www.cnblogs.com/tszr/p/12228118.html