Plotting the ReLU Activation Function from Neural Networks in Python (with Code)

Copyright notice: please credit the source when reposting: http://blog.csdn.net/yezhongdefeng https://blog.csdn.net/YeZhongDeFeng/article/details/78274307
In the previous post we plotted the Sigmoid and Tanh activation functions. Today I'm sharing the code for the ReLU activation function. The overall structure is the same as last time; only the function itself changes:
import matplotlib.pyplot as plt
import numpy as np
import matplotlib as mpl

# Render minus signs correctly on the tick labels
mpl.rcParams['axes.unicode_minus'] = False

fig = plt.figure(figsize=(6, 4))
ax = fig.add_subplot(111)

# ReLU is piecewise linear, so integer sample points suffice;
# use -10..10 inclusive so the curve reaches both endpoints
x = np.arange(-10, 11)
y = np.where(x < 0, 0, x)

plt.xlim(-11, 11)
plt.ylim(-11, 11)

# Hide the top and right spines, then move the remaining
# two to the origin so the axes cross at (0, 0)
ax.spines['top'].set_color('none')
ax.spines['right'].set_color('none')

ax.xaxis.set_ticks_position('bottom')
ax.spines['bottom'].set_position(('data', 0))
ax.set_xticks([-10, -5, 0, 5, 10])
ax.yaxis.set_ticks_position('left')
ax.spines['left'].set_position(('data', 0))
ax.set_yticks([-10, -5, 5, 10])

plt.plot(x, y, label="ReLU", color="blue")
plt.legend()
plt.show()

Of course, ReLU is simple enough to sketch by hand without any of this; the method is given here in the hope that it will be a useful starting point when plotting other functions. The result looks like this:
(Figure: the ReLU curve, flat at 0 for x < 0 and equal to x for x ≥ 0.)
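As a side note, if you only need the ReLU values rather than the styled plot, `np.maximum` is a common, concise way to write the same function in NumPy (it is equivalent to the `np.where(x < 0, 0, x)` used above). A minimal sketch; the function name `relu` is my own choice for illustration:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x), same result as np.where(x < 0, 0, x)
    return np.maximum(0, x)

x = np.array([-10, -1, 0, 1, 10])
print(relu(x))  # [ 0  0  0  1 10]
```

Either form works as the `y` in the plotting code above; `np.maximum` just reads a little closer to the mathematical definition max(0, x).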