Implementing Neural Network Activation Functions in Python

Activation functions

The activation function must be non-linear. The problem with a linear activation is that no matter how many layers the network has, it is equivalent to a neural network without hidden layers.
For example, take the linear function h(x) = cx as the activation function.
A three-layer network y(x) = h(h(h(x))) simplifies to y(x) = c*c*c*x = c³x, a single linear map, so the extra layers cannot deliver the advantages a multi-layer network is supposed to bring.
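To make this concrete, here is a minimal sketch (not from the original post; the weight matrices W1, W2, W3 are illustrative) showing that three stacked linear layers collapse into one linear map:

# coding: utf-8
import numpy as np

rng = np.random.default_rng(0)
# Three layers with purely linear activations, represented by 3x3 weight matrices.
W1, W2, W3 = rng.standard_normal((3, 3, 3))

x = rng.standard_normal(3)

# Passing x through three linear layers...
deep = W3 @ (W2 @ (W1 @ x))
# ...equals one layer whose weight is the single product W3 @ W2 @ W1.
shallow = (W3 @ W2 @ W1) @ x

print(np.allclose(deep, shallow))  # True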

1) Step function:

# coding: utf-8
import numpy as np
import matplotlib.pylab as plt

def step_function(x):
    # Output 1 where x > 0, otherwise 0 (np.int was removed in NumPy 1.24; use the built-in int).
    return np.array(x > 0, dtype=int)

X = np.arange(-5.0, 5.0, 0.1)
Y = step_function(X)
plt.plot(X, Y)
plt.ylim(-0.1, 1.1)  # specify the range of the y-axis in the plot
plt.show()

The graph is as follows:
[Figure: step function plot, jumping from 0 to 1 at x = 0]

Breaking down the step_function implementation:
>>> import numpy as np
>>> x = np.array([-1.0,1.0,2.0])
>>> x
array([-1.,  1.,  2.])
>>> y = x > 0
>>> y
array([False,  True,  True])
>>> y = y.astype(int)  # np.int was removed in NumPy 1.24; use the built-in int
>>> y
array([0, 1, 1])

2) Sigmoid function:
The sigmoid, h(x) = 1 / (1 + exp(-x)), is a smooth curve whose output changes continuously with the input, whereas the step function switches abruptly at x = 0. This smoothness is of great significance for the learning of neural networks.
Another difference is that while the step function can only return 0 or 1, the sigmoid function returns real numbers such as 0.731... or 0.880....
The sigmoid can be likened to a waterwheel, which adjusts the amount of water it sends out according to how much water flows in.
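A quick REPL check of those values (a small sketch, not part of the original post):

>>> import numpy as np
>>> def sigmoid(x):
...     return 1 / (1 + np.exp(-x))
...
>>> sigmoid(np.array([1.0, 2.0]))
array([0.73105858, 0.88079708])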

# coding: utf-8
import numpy as np
import matplotlib.pylab as plt

def sigmoid(x):
    return 1 / (1 + np.exp(-x))    

X = np.arange(-5.0, 5.0, 0.1)
Y = sigmoid(X)
plt.plot(X, Y)
plt.ylim(-0.1, 1.1)
plt.show()

The graph is as follows:
[Figure: sigmoid plot, a smooth S-shaped curve rising from 0 to 1]
3) ReLU function
The ReLU (Rectified Linear Unit) outputs the input unchanged when it is greater than 0, and outputs 0 otherwise: ReLU(x) = max(0, x).

# coding: utf-8
import numpy as np
import matplotlib.pylab as plt


def relu(x):
    return np.maximum(0, x)

x = np.arange(-5.0, 5.0, 0.1)
y = relu(x)
plt.plot(x, y)
plt.ylim(-1.0, 5.5)
plt.show()

The graph is as follows:
[Figure: ReLU plot, flat at 0 for x < 0 and the line y = x for x ≥ 0]
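
As a quick sanity check, here is a small REPL demo in the same style as the step_function breakdown above (a sketch, not part of the original post):

>>> import numpy as np
>>> np.maximum(0, np.array([-1.0, 0.0, 2.0]))
array([0., 0., 2.])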

Source: blog.csdn.net/WANGYONGZIXUE/article/details/110290300