TensorFlow activation function graph display

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-10, 10, 100)  # 100 evenly spaced inputs over [-10, 10]

"""

The role of the activation function is to solve the nonlinear problem, the two states of the Rubik's cube human brain neurons, namely the activation state and the inhibition state,

Convert part of the data to a suppressed state through an activation function

 

"""

 

"""
Advantages: Its output is mapped in (0,1), monotonically continuous, very suitable for use as an output layer, and the derivation is relatively easy;
Disadvantages: It has soft saturation, once the input falls into the saturation region, the first derivative is When it becomes close to 0, it is easy to produce gradient disappearance.
Saturation: when |x|>c, where c is a constant, the first derivative is equal to 0. In layman's terms, the first derivative is the slope in the above figure, The function is getting more and more horizontal.
"""


y1 = tf.nn.sigmoid(x)   # 1 / (1 + e^-x), output in (0, 1)
y2 = tf.nn.tanh(x)      # output in (-1, 1), zero-centered
y3 = tf.nn.elu(x)       # exponential linear unit: x for x > 0, e^x - 1 otherwise
y4 = tf.nn.softplus(x)  # ln(1 + e^x), a smooth approximation of relu
y5 = tf.nn.softsign(x)  # x / (1 + |x|), output in (-1, 1)

"""

relu is hard saturated when x<0. Since the first derivative is 1 when x>0. Therefore, the relu function can keep the gradient from decaying when x>0,
thereby alleviating the problem of gradient disappearance, and it can also converge faster. However, as training progresses,
some of the inputs will fall into the hard saturation region, causing the corresponding weights to fail to update. We call this "neuron death." """
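# A small sketch of relu's hard saturation, assuming TensorFlow 1.x graph mode
# as used in the rest of this script: tf.gradients shows the derivative is
# exactly 0 for x < 0 (no weight updates, hence "neuron death") and 1 for x > 0.
_xt = tf.constant([-3.0, -1.0, 1.0, 3.0])
_grad = tf.gradients(tf.nn.relu(_xt), _xt)[0]
with tf.Session() as _s:
    print(_s.run(_grad))  # [0. 0. 1. 1.]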


y6 = tf.nn.relu(x)        # max(x, 0)
y7 = tf.nn.relu6(x)       # min(max(x, 0), 6), relu capped at 6
y8 = tf.nn.leaky_relu(x)  # x for x > 0, alpha * x (default alpha=0.2) for x < 0

# Display the different activation functions, one subplot per function
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    ax1 = plt.subplot2grid((4, 2), (0, 0))
    ax1.plot(x, sess.run(y1))
    ax1.set_title("sigmoid")

    ax2 = plt.subplot2grid((4, 2), (0, 1))
    ax2.plot(x, sess.run(y2))
    ax2.set_title("tanh")

    ax3 = plt.subplot2grid((4, 2), (1, 0))
    ax3.plot(x, sess.run(y3))
    ax3.set_title("elu")

    ax4 = plt.subplot2grid((4, 2), (1, 1))
    ax4.plot(x, sess.run(y4))
    ax4.set_title("softplus")

    ax5 = plt.subplot2grid((4, 2), (2, 0))
    ax5.plot(x, sess.run(y5))
    ax5.set_title("softsign")

    ax6 = plt.subplot2grid((4, 2), (2, 1))
    ax6.plot(x, sess.run(y6))
    ax6.set_title("relu")

    ax7 = plt.subplot2grid((4, 2), (3, 0))
    ax7.plot(x, sess.run(y7))
    ax7.set_title("relu6")

    ax8 = plt.subplot2grid((4, 2), (3, 1))
    ax8.plot(x, sess.run(y8))
    ax8.set_title("leaky_relu")

plt.show()
