[Study Notes] Introduction to Deep Learning: Theory and Implementation Based on Python - Python Introduction and the Perceptron

1. Introduction to Python

1.1 NumPy

In deep learning implementations, calculations on arrays and matrices come up constantly. NumPy's array class (numpy.array) provides many convenient methods, which we will use when implementing deep learning. For details on how to install various third-party libraries, see: Tutorial on installing third-party modules related to Python machine learning and data analysis in VS Code.

Import NumPy:

import numpy as np

Generate NumPy arrays:

x = np.array([1.0, 2.0, 3.0])
print(x)  # [1. 2. 3.]

NumPy array arithmetic operations:

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
x + y  # [3. 6. 9.]
x - y  # [-1. -2. -3.]
x * y  # [2. 8. 18.]
x / y  # [0.5 0.5 0.5]
x / 2.0  # [0.5 1. 1.5], broadcasting

Multidimensional NumPy arrays:

A = np.array([[1, 2], [3, 4]])
A.shape  # (2, 2), the shape of A
A.dtype  # dtype('int64'), the data type of the matrix elements

The broadcast function is shown in the figure below:

(Figure: examples of NumPy broadcasting, where a scalar or lower-dimensional array is stretched to match the shape of the other operand)
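The broadcasting behavior illustrated in the figure can be reproduced with a short snippet (a minimal sketch; the specific matrices are illustrative):

```python
import numpy as np

# A 2x2 matrix multiplied element-wise by a scalar:
# the scalar is "broadcast" to every element.
A = np.array([[1, 2], [3, 4]])
print(A * 10)  # [[10 20] [30 40]]

# A 1-D array is broadcast along each row of the matrix.
B = np.array([10, 20])
print(A * B)   # [[10 40] [30 80]]
```

In both cases NumPy behaves as if the smaller operand had been copied to shape (2, 2) first, without actually allocating the copies.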

Ways to access matrix elements:

X = np.array([[51, 55], [14, 19], [0, 4]])
X[0]  # array([51, 55]), row 0
X[0][1]  # 55, the element at position (0, 1)
X = X.flatten()  # convert X to a one-dimensional array
print(X)  # [51 55 14 19  0  4]
X[np.array([0, 2, 4])]  # array([51, 14, 0]), the elements at indices 0, 2, 4

Extract the elements of X that are greater than 15:

X > 15  # array([True, True, False, True, False, False], dtype=bool)
X[X > 15]  # array([51, 55, 19])

1.2 Matplotlib

In deep learning experiments, graph drawing and data visualization are very important. Matplotlib is a library for drawing graphs; with it, plots and data visualizations are easy to produce.

Draw the graph of the sin function:

import numpy as np
import matplotlib.pyplot as plt

# generate data
x = np.arange(0, 6, 0.1)  # data in [0, 6) with step 0.1
y = np.sin(x)

# draw the graph
plt.plot(x, y)
plt.show()

The drawing result is shown in the figure below:

(Figure: plot of the sin function)

Add the cos function, along with a title, an x-axis label, and other features:

import numpy as np
import matplotlib.pyplot as plt

# generate data
x = np.arange(0, 6, 0.1)  # data in [0, 6) with step 0.1
y1 = np.sin(x)
y2 = np.cos(x)

# draw the graphs
plt.plot(x, y1, label="sin")
plt.plot(x, y2, linestyle="--", label="cos")  # drawn with a dashed line
plt.xlabel("x")  # x-axis label
plt.ylabel("y")  # y-axis label
plt.title('sin & cos')  # title
plt.legend()
plt.show()

The drawing result is shown in the figure below:

(Figure: sin and cos curves with axis labels, title, and legend)

pyplot also provides a method for displaying images, imshow(). Images can be read in with imread() from the matplotlib.image module:

import matplotlib.pyplot as plt
from matplotlib.image import imread

img = imread('lena.png')  # read the image (set an appropriate path! here we assume lena.png is in the current directory)
plt.imshow(img)
plt.show()

After running the above code, the image shown below will be displayed:

(Figure: the image lena.png displayed with imshow())

2. Perceptron

2.1 What is a perceptron

A perceptron receives multiple input signals and outputs a single signal. The figure below shows an example of a perceptron that receives two input signals: $x_1, x_2$ are the input signals, $y$ is the output signal, and $w_1, w_2$ are the weights. The circles in the diagram are called "neurons" or "nodes". When an input signal is sent to a neuron, it is multiplied by a fixed weight ($w_1 x_1$, $w_2 x_2$). The neuron computes the sum of the incoming signals and outputs 1 only when that sum exceeds a certain boundary value; this is also described as "the neuron being activated". The boundary value is called the threshold and is denoted by $\theta$.

(Figure: a perceptron receiving two input signals $x_1$ and $x_2$ through weights $w_1$ and $w_2$, producing output $y$)

The above behavior can be expressed mathematically as the following formula:

$$
y = \begin{cases} 0 & (w_1 x_1 + w_2 x_2 \le \theta) \\ 1 & (w_1 x_1 + w_2 x_2 > \theta) \end{cases}
$$

The multiple input signals to a perceptron have inherent weights that act to control the importance of each signal. That is, the larger the weight, the higher the importance of the signal corresponding to that weight.

2.2 Simple logic circuit

Now consider using a perceptron to implement an AND gate. Its truth table is shown below:

| $x_1$ | $x_2$ | $y$ |
| --- | --- | --- |
| 0 | 0 | 0 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 1 | 1 |

There are countless parameter choices that satisfy these conditions. For example, $(w_1, w_2, \theta) = (0.5, 0.5, 0.7)$ works: with these parameters, the weighted sum of the signals exceeds the threshold $\theta$ only when $x_1$ and $x_2$ are both 1.

2.3 Realization of Perceptron

Use Python to realize the above logic circuit:

def AND(x1, x2):
	w1, w2, theta = 0.5, 0.5, 0.7
	tmp = x1 * w1 + x2 * w2
	if tmp <= theta:
		return 0
	elif tmp > theta:
		return 1
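As a quick check, the gate can be evaluated on all four input combinations (the `AND` function below is the one defined above, repeated here so the snippet runs on its own):

```python
def AND(x1, x2):
    # Perceptron AND gate with weights 0.5, 0.5 and threshold 0.7.
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1 * w1 + x2 * w2
    return 0 if tmp <= theta else 1

# Only (1, 1) gives a weighted sum (1.0) above the threshold 0.7.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(AND(x1, x2))  # 0, 0, 0, 1
```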

If we replace the $\theta$ in the earlier formula with $-b$, the perceptron can be represented by the following formula:

$$
y = \begin{cases} 0 & (b + w_1 x_1 + w_2 x_2 \le 0) \\ 1 & (b + w_1 x_1 + w_2 x_2 > 0) \end{cases}
$$

Here, $b$ is called the bias, and $w_1$ and $w_2$ are called weights. The perceptron multiplies the input signals by the weights, adds the bias, and outputs 1 if the result is greater than 0, and 0 otherwise.

Using weights and biases, an AND gate can be implemented like this:

import numpy as np

def AND(x1, x2):
	x = np.array([x1, x2])
	w = np.array([0.5, 0.5])
	b = -0.7
	tmp = np.sum(w * x) + b
	if tmp <= 0:
		return 0
	else:
		return 1

Note that the bias and the weights $w_1, w_2$ play different roles. Specifically, $w_1$ and $w_2$ are parameters that control the importance of each input signal, while the bias is a parameter that adjusts how easily the neuron is activated (that is, how easily it outputs 1).
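The bias's effect can be seen concretely in a small sketch (the helper name `step` and the specific numbers here are illustrative, not from the original text): with the same weights and input, shrinking the magnitude of a negative bias makes the neuron fire on weaker input.

```python
import numpy as np

def step(x, w, b):
    # Perceptron output: 1 if w.x + b > 0, else 0.
    return int(np.sum(w * x) + b > 0)

w = np.array([0.5, 0.5])
x = np.array([1.0, 0.0])  # only one input signal is on

# With b = -0.7 the neuron needs both inputs to fire (AND-like behavior).
print(step(x, w, -0.7))  # 0
# With b = -0.2 the neuron is easier to activate (OR-like behavior).
print(step(x, w, -0.2))  # 1
```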

2.4 Limitations of Perceptron

The truth table of the XOR gate is shown below:

| $x_1$ | $x_2$ | $y$ |
| --- | --- | --- |
| 0 | 0 | 0 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 1 | 0 |

This XOR gate cannot be realized with the perceptron introduced so far. To see why, take the OR gate as an example: with the parameters $(b, w_1, w_2) = (-0.5, 1.0, 1.0)$ the truth table is satisfied, and the perceptron can be expressed by the following formula:

$$
y = \begin{cases} 0 & (-0.5 + x_1 + x_2 \le 0) \\ 1 & (-0.5 + x_1 + x_2 > 0) \end{cases}
$$

This perceptron divides the plane into two regions with the straight line $-0.5 + x_1 + x_2 = 0$: one region outputs 1 and the other outputs 0, as shown below:

(Figure: the line $-0.5 + x_1 + x_2 = 0$ dividing the plane into an output-0 region and an output-1 region)

The OR gate outputs 0 at $(x_1, x_2) = (0, 0)$ and outputs 1 at $(0, 1)$, $(1, 0)$, and $(1, 1)$. In the figure above, circles represent 0 and triangles represent 1.

The output of the XOR gate is shown in the figure below:

(Figure: the XOR outputs plotted in the $x_1$-$x_2$ plane, circles for 0 and triangles for 1)

No matter how it is drawn, a single straight line cannot separate the circles from the triangles in the figure above.

A nonlinear region bounded by a curve, however, can separate them:

(Figure: a curve separating the XOR outputs that no straight line could separate)

2.5 Multilayer Perceptron

The cool thing about the perceptron is that layers can be "stacked". An XOR gate can be implemented by combining AND, NAND, and OR gates, as shown below:

(Figure: an XOR gate assembled from a NAND gate and an OR gate feeding into an AND gate)

Assuming that all three gates have already been implemented, the XOR gate can be written in Python as follows:

def XOR(x1, x2):
	s1 = NAND(x1, x2)
	s2 = OR(x1, x2)
	y = AND(s1, s2)
	return y
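The NAND and OR gates assumed above can be written in the same weight-and-bias style as the AND gate; only the parameters differ (the specific values below are one common choice, sketched for completeness):

```python
import numpy as np

def AND(x1, x2):
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])
    b = -0.7
    return 0 if np.sum(w * x) + b <= 0 else 1

def NAND(x1, x2):
    x = np.array([x1, x2])
    w = np.array([-0.5, -0.5])  # weights and bias are the negation of AND's
    b = 0.7
    return 0 if np.sum(w * x) + b <= 0 else 1

def OR(x1, x2):
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])
    b = -0.2  # smaller-magnitude bias: one active input is enough to fire
    return 0 if np.sum(w * x) + b <= 0 else 1

def XOR(x1, x2):
    # Two-layer perceptron: NAND and OR feed into AND.
    s1 = NAND(x1, x2)
    s2 = OR(x1, x2)
    return AND(s1, s2)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(XOR(x1, x2))  # 0, 1, 1, 0
```

With these definitions the combined XOR reproduces its truth table exactly, even though no single-layer perceptron could.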

Next, let's express this XOR gate in perceptron notation (explicitly drawing the neurons). As shown below, the XOR gate is a multi-layer network: the leftmost column is layer 0, the middle column is layer 1, and the rightmost column is layer 2.

(Figure: the XOR gate drawn as a multi-layer perceptron with layers 0, 1, and 2)

Next section: [Study Notes] Introduction to Deep Learning: Theory and Implementation Based on Python - Neural Network .


Origin blog.csdn.net/m0_51755720/article/details/128128915