An example of building an artificial neural network based on Keras

Artificial Intelligence (AI) has become a popular field, and building neural networks is a core part of it. This article walks through an entry-level example of building an artificial neural network with Keras (TensorFlow backend).
If you are not yet comfortable with Python, you can spend a couple of hours on this Python tutorial:
https://pan.baidu.com/s/139i6QEMFdBuG7EmK8SwhFg
extraction code: 176t
On to the code:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
import matplotlib.pyplot as plt
np.random.seed(1377)

~ This section imports the libraries we need. If any of them is missing, install it first (e.g. with pip)
~ numpy is a fast math library, mainly used for array computation
~ keras is the protagonist: a simple yet powerful framework for building neural networks; supported backends include TensorFlow, Theano, and CNTK
~ matplotlib is a plotting library
~ np.random.seed fixes the random number generator so that the random draws (and therefore the results) are reproducible
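To make the role of the seed concrete, here is a small standalone sketch (not part of the original script) showing that the same seed produces the same random draws:

```python
import numpy as np

# Seeding the generator makes every subsequent random draw reproducible.
np.random.seed(1377)
a = np.random.normal(0, 0.05, (3,))

# Re-seeding with the same value replays the exact same sequence.
np.random.seed(1377)
b = np.random.normal(0, 0.05, (3,))

print(np.allclose(a, b))  # True: same seed, same numbers
```

This is why the tutorial calls np.random.seed(1377) before generating data: anyone running the script gets the same data set.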

# create some data
X1=np.linspace(-1,1,200)
np.random.shuffle(X1)
Y=0.5*X1+2+np.random.normal(0,0.05,(200,))
#plot
plt.scatter(X1,Y)
plt.show()

~ This block generates the data set for the experiment: X1 takes 200 evenly spaced numbers between -1 and 1, then shuffles their order
~ Y is then defined as a function of X1
~ np.random.normal draws from a normal distribution with mean 0, standard deviation 0.05, and output shape (200,), so the result is Y ≈ 0.5 * X1 + 2 plus a little noise
~ the scatter plot of the 200 generated data points is shown below
[scatter plot of the 200 data points]
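To see that the injected noise really is small relative to the line, here is a standalone NumPy sketch (reusing the same seed and shapes as above, no Keras needed) that regenerates the data and checks the residual statistics:

```python
import numpy as np

np.random.seed(1377)
X1 = np.linspace(-1, 1, 200)
np.random.shuffle(X1)
noise = np.random.normal(0, 0.05, (200,))
Y = 0.5 * X1 + 2 + noise

# Subtracting the ideal line recovers exactly the injected noise,
# so its mean should be near 0 and its standard deviation near 0.05.
residual = Y - (0.5 * X1 + 2)
print(abs(residual.mean()) < 0.02)
print(abs(residual.std() - 0.05) < 0.02)
```

So Y hugs the line 0.5 * X1 + 2 closely, which is what the scatter plot shows.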

X1_train,Y_train=X1[:160],Y[:160]
X1_test,Y_test=X1[160:],Y[160:]

~ The first 80% of the data go into the training set and the last 20% into the test set (an 80/20 split is common)
~ Next we build the neural network:

model=Sequential()
model.add(Dense(units=1, input_dim=1))
model.compile(loss="mean_squared_error",optimizer='sgd')

~ The model uses the Sequential structure: layers are stacked one after another
~ model.add adds a layer; here a Dense (fully connected) layer with one output (units=1, called output_dim in older Keras versions) and one input (input_dim=1)
~ model.compile sets the remaining parameters: loss = 'mean_squared_error' (measure the error with MSE) and optimizer = 'sgd' (stochastic gradient descent)
~ With that, the simplest possible artificial neural network is built; the next step is to train it
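A one-unit Dense layer with one input is just the linear model Y = W * X + b, so we know in advance what the network should converge to. As a sanity check (a NumPy sketch, not part of the original script), the closed-form least-squares fit on the training portion already lands near W = 0.5 and b = 2:

```python
import numpy as np

np.random.seed(1377)
X1 = np.linspace(-1, 1, 200)
np.random.shuffle(X1)
Y = 0.5 * X1 + 2 + np.random.normal(0, 0.05, (200,))

# np.polyfit with deg=1 returns [slope, intercept] of the least-squares
# line, i.e. the optimum the SGD training below is converging toward.
W_ls, b_ls = np.polyfit(X1[:160], Y[:160], 1)
print(abs(W_ls - 0.5) < 0.05, abs(b_ls - 2.0) < 0.05)
```

The SGD-trained network can only approach this optimum, so W_ls and b_ls give us a yardstick for the training results that follow.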

print("Training..........")
for i in range(301):
    loss=model.train_on_batch(X1_train,Y_train)
    if i%50==0:
        print('train loss:',loss)

~ During training we show the network both the inputs and the outputs (supervised learning)
~ model.train_on_batch trains the network on one batch of data and returns the loss value
~ We train for 301 steps and print the train loss every 50 steps
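What train_on_batch does under the hood can be written out by hand for this tiny model. The following standalone NumPy sketch repeats the same 301 gradient-descent steps on the MSE loss of Y = W * X + b (the learning rate 0.1 is chosen for this sketch, not taken from the Keras defaults):

```python
import numpy as np

np.random.seed(1377)
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)
Y = 0.5 * X + 2 + np.random.normal(0, 0.05, (200,))
X_train, Y_train = X[:160], Y[:160]

# Hand-rolled gradient descent on mean((W*x + b - y)^2),
# mimicking what each train_on_batch call does for a Dense(1) layer.
W, b, lr = 0.0, 0.0, 0.1
for _ in range(301):
    err = W * X_train + b - Y_train
    # Gradients of the MSE loss with respect to W and b.
    W -= lr * 2 * np.mean(err * X_train)
    b -= lr * 2 * np.mean(err)
print(abs(W - 0.5) < 0.05, abs(b - 2.0) < 0.05)
```

After 301 steps the parameters are close to the true values, which matches the falling train loss printed by the Keras loop.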

print('\nTesting........')
loss=model.evaluate(X1_test,Y_test,batch_size=160)
print('test loss:',loss)
W,b=model.layers[0].get_weights()
print('W:',W,'\nb:',b)

~ This section is the test part: model.evaluate measures how well the trained network performs on the test set
~ Because we built only a single Dense layer, the model is really just the function Y = W * X + b; the closer W gets to 0.5 and b to 2 after training, the better the fit
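We can also predict what a good test loss looks like. If the network recovered W = 0.5 and b = 2 exactly, the only remaining error would be the injected noise, so the test MSE should sit near 0.05² = 0.0025. A standalone NumPy sketch (same seed and split as above) checks this:

```python
import numpy as np

np.random.seed(1377)
X1 = np.linspace(-1, 1, 200)
np.random.shuffle(X1)
Y = 0.5 * X1 + 2 + np.random.normal(0, 0.05, (200,))
X1_test, Y_test = X1[160:], Y[160:]

# With the true parameters, the residual on the test set is pure noise,
# so the MSE should be on the order of 0.05**2 = 0.0025.
mse = np.mean((0.5 * X1_test + 2 - Y_test) ** 2)
print(mse < 0.01)
```

A test loss in that ballpark from model.evaluate therefore means the network has learned essentially all it can from this data.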

Y_pred=model.predict(X1_test)
plt.scatter(X1_test,Y_test)
plt.plot(X1_test,Y_pred)
plt.show()

~ This block uses model.predict to predict Y from the X1 test set (usually what we actually care about) and plots it against Y_test to see how close they are (the test loss above measures the same thing)
~ Let's take a look at the training result

~ W and b end up close to 0.5 and 2
~ In the final plot, the line is the network's linear prediction and the scattered points are the real test outputs
~ That's it for this entry-level example; please leave a message if you spot a mistake

Origin blog.csdn.net/qq_45074963/article/details/105317462