Table of Contents
Chapter 2: Perceptron (Artificial Neuron)
2.5 Implementing a multi-layer perceptron (solving the XOR gate)
Chapter 2: Perceptron (Artificial Neuron)
The perceptron is the algorithm at the origin of neural networks.
2. Perceptron
Concept: the perceptron is essentially a question of whether a signal "flows" or not. Flowing corresponds to 1 and not flowing to 0: 0 means "do not transmit a signal" and 1 means "transmit a signal".
Here x1 and x2 are the inputs, y is the output, and w1 and w2 are the weights; y is activated (outputs 1) only when the weighted sum w1*x1 + w2*x2 exceeds the threshold θ.
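The rule above can be written as a minimal sketch; the concrete weight and threshold values used in the example call below are illustrative assumptions, not values given in the text.

```python
def perceptron(x1, x2, w1, w2, theta):
    """Fire (return 1) only when the weighted sum exceeds the threshold theta."""
    s = w1 * x1 + w2 * x2
    return 1 if s > theta else 0

# Example: with w1 = w2 = 0.5 and theta = 0.7 (assumed values),
# the neuron fires only when both inputs are 1.
print(perceptron(1, 1, 0.5, 0.5, 0.7))  # 1
print(perceptron(1, 0, 0.5, 0.5, 0.7))  # 0
```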
2.1 Simple logic circuit
The logic circuits for the AND, NAND, and OR gates are relatively simple: pick weights and a threshold so that the neuron is activated according to each gate's truth table, then apply the perceptron formula above to check whether the weighted sum exceeds the threshold.
2.2 Code implementation:
AND gate implementation (the other gates are similar):
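A minimal sketch of the AND gate in the threshold form described above. The weights (0.5, 0.5) and threshold 0.7 are one common hand-picked choice, assumed here; NAND and OR only differ in their weights and threshold.

```python
def AND(x1, x2):
    # Hand-picked parameters (assumed): any w1, w2, theta satisfying the
    # AND truth table would work, e.g. w1 = w2 = 0.5, theta = 0.7.
    w1, w2, theta = 0.5, 0.5, 0.7
    return 1 if w1 * x1 + w2 * x2 > theta else 0

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND({x1}, {x2}) = {AND(x1, x2)}")
```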
2.3 Introducing weights and biases
Starting from the formula above, if we rename the threshold θ as -b (calling b the bias) and move it to the other side, the activation condition becomes b + w1*x1 + w2*x2 > 0.
The bias and the weights play different roles: a weight is a parameter that controls the importance of its input signal, while the bias is a parameter that adjusts how easily the neuron is activated (how readily the output becomes 1).
NumPy code implementation:
Note: the weights and biases here are chosen by hand based on experience; they are not learned.
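A sketch of the three gates in the bias form y = 1 if b + w·x > 0, using NumPy. The parameter values are the usual hand-picked ones assumed above; as the note says, they come from experience, not from learning.

```python
import numpy as np

def AND(x):
    w, b = np.array([0.5, 0.5]), -0.7   # hand-picked weights and bias
    return int(np.sum(w * x) + b > 0)

def NAND(x):
    w, b = np.array([-0.5, -0.5]), 0.7  # negated AND parameters
    return int(np.sum(w * x) + b > 0)

def OR(x):
    w, b = np.array([0.5, 0.5]), -0.2   # smaller bias: one active input suffices
    return int(np.sum(w * x) + b > 0)

x = np.array([1, 0])
print(AND(x), NAND(x), OR(x))  # 0 1 1
```

Only the bias b differs between AND and OR here, which illustrates its role: a larger b (less negative) makes the neuron easier to activate.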
2.4 Perceptron limitations
The space divided by a curve is called a nonlinear space, and the space divided by a straight line is called a linear space.
A single-layer perceptron can represent the AND, OR, and NAND gates because they are all linearly separable. XOR is not linearly separable, so it cannot be represented by a single-layer perceptron, but it can be represented by stacking multiple perceptrons.
2.5 Implementing a multi-layer perceptron (solving the XOR gate)
Combining the gates studied above, a multi-layer structure solves the XOR gate: XOR(x1, x2) = AND(NAND(x1, x2), OR(x1, x2)).
Code:
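A self-contained sketch of this two-layer XOR: layer 1 computes NAND and OR of the inputs, and layer 2 ANDs those two intermediate signals. The gate parameters are the same hand-picked values assumed earlier.

```python
import numpy as np

def _gate(x, w, b):
    # Shared bias-form perceptron: fire when b + w.x > 0.
    return int(np.sum(np.array(w) * np.array(x)) + b > 0)

def AND(x1, x2):  return _gate([x1, x2], [0.5, 0.5], -0.7)
def NAND(x1, x2): return _gate([x1, x2], [-0.5, -0.5], 0.7)
def OR(x1, x2):   return _gate([x1, x2], [0.5, 0.5], -0.2)

def XOR(x1, x2):
    s1 = NAND(x1, x2)   # layer 1
    s2 = OR(x1, x2)     # layer 1
    return AND(s1, s2)  # layer 2

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR({x1}, {x2}) = {XOR(x1, x2)}")
```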
The logic is as follows:
Think of the XOR gate as a neural network with a multi-layer structure: the leftmost column is layer 0, the middle column is layer 1, and the rightmost column is layer 2.
By stacking layers, perceptrons can perform nonlinear representations; in theory, multi-layer perceptrons can even represent the processing a computer performs.
2.6 Summary
This chapter is mostly textbook mathematics; the most important thing is to understand the relationship between the perceptron and neural networks.