TensorFlow + Keras Deep Learning and Artificial Intelligence in Practice, Chapter 2: Deep Learning Principles

2.1 Principles of neural signal transmission

y=activation(x*w+b)

The activation function is usually a nonlinear function, such as the Sigmoid function or the ReLU function.
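As a minimal sketch, the two activation functions named above can be written in plain NumPy (the function names here are our own, not from the book):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive values through unchanged, zeros out negatives
    return np.maximum(0.0, x)

print(sigmoid(0.0))  # 0.5
print(relu(-3.0))    # 0.0
```

Sigmoid is useful when the output should look like a probability; ReLU is the common default for hidden layers because it is cheap and avoids saturating for large positive inputs.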

2.2 Simulating a neural network with matrix operations

y=activation(x*w+b)

output = activation(input * weight + bias)
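The formula above can be sketched as a NumPy matrix operation; the sizes (3 inputs, 2 neurons) and the weight and bias values below are made-up examples, not from the book:

```python
import numpy as np

def sigmoid(x):
    # Nonlinear activation squashing outputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([[0.4, 0.2, 0.4]])       # input: 1 sample, 3 features
w = np.array([[-0.5, -0.2],
              [-0.3,  0.4],
              [-0.5,  0.2]])          # weights: 3 inputs x 2 neurons
b = np.array([[0.1,  0.2]])           # bias: one per neuron

# y = activation(x * w + b), where * is a matrix product
y = sigmoid(np.dot(x, w) + b)
print(y.shape)  # (1, 2)
```

Each column of `w` holds the weights of one neuron, so a single matrix product computes every neuron's weighted sum at once.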

2.3 Multilayer Perceptron Model

1 Recognizing MNIST handwritten digit images with a multilayer perceptron model

Each input is a 28*28 two-dimensional image that is converted by reshape into a one-dimensional vector of 784 values, which feeds the 784 neurons of the input layer.

The input layer has 784 input neurons to receive external signals

The hidden layer simulates the internal neurons; it contains 256 hidden neurons.

The 10 neurons in the output layer produce the predicted result.

The 10 outputs correspond to the digits 0-9 that we want to predict.
