Dive into Deep Learning (3) --- Multilayer Perceptron (MLP)

1. Hidden layers

An MLP adds one or more hidden layers to a single-layer neural network; the hidden layers sit between the input layer and the output layer.

In the example, the input layer has 4 units, the output layer has 3 units (one per class label), and the hidden layer has 5 units. The input layer involves no computation, so this multilayer perceptron has 2 layers.

Each hidden unit is fully connected to every input, and each output unit is fully connected to every hidden unit, so the network contains two fully connected layers: the hidden layer and the output layer.

Suppose the hidden layer's weights and bias are W_h and b_h, and the output layer's weights and bias are W_o and b_o. The output O is computed as

H = XW_h + b_h

O = HW_o + b_o

That is, O = (XW_h + b_h)W_o + b_o = XW_hW_o + b_hW_o + b_o

So this network is still equivalent to a single-layer network whose output-layer weights are W_hW_o and whose bias is b_hW_o + b_o.
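The collapse into a single linear layer can be checked numerically. The sketch below uses the sizes from the example (4 inputs, 5 hidden units, 3 outputs); the variable names W_h, b_h, W_o, b_o are just the symbols from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 4))                        # a batch of 2 examples, 4 features
W_h, b_h = rng.standard_normal((4, 5)), rng.standard_normal(5)
W_o, b_o = rng.standard_normal((5, 3)), rng.standard_normal(3)

# Two stacked linear layers without an activation...
H = X @ W_h + b_h
O = H @ W_o + b_o

# ...equal one linear layer with weights W_h W_o and bias b_h W_o + b_o
O_single = X @ (W_h @ W_o) + (b_h @ W_o + b_o)
print(np.allclose(O, O_single))  # True
```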

Question: the model is still a linear model; how can it solve nonlinear problems?

Solution: introduce a nonlinear transformation --- an activation function --- to handle nonlinear problems.

2. Activation functions

(1) ReLU function

Given an element x, ReLU(x) = max(x, 0)

(2) Sigmoid function

The sigmoid function maps each element's value into the interval (0, 1): sigmoid(x) = 1 / (1 + exp(-x)). It was widely used in early neural networks; ReLU is now more common.

(3) tanh function

The hyperbolic tangent (tanh) function maps each element's value into the interval (-1, 1).
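The three activation functions above can be sketched in a few lines of NumPy; each is applied elementwise.

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(x, 0), applied elementwise
    return np.maximum(x, 0)

def sigmoid(x):
    # Maps each element into the interval (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Maps each element into the interval (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))     # [0. 0. 2.]
print(sigmoid(x))  # values in (0, 1), with sigmoid(0) = 0.5
print(tanh(x))     # values in (-1, 1), with tanh(0) = 0
```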

3. Multilayer perceptron

A multilayer perceptron is a neural network composed of fully connected layers with at least one hidden layer, where the output of each hidden layer is transformed by an activation function.

The number of hidden layers and the number of units in each hidden layer are hyperparameters of the MLP that you can set yourself.

Here φ denotes the activation function; with it, the MLP computes H = φ(XW_h + b_h) and O = HW_o + b_o.

 


Origin www.cnblogs.com/slfh/p/10897126.html