MNIST neural network learning in Python: loss functions

Question: how can a program recognize the handwritten digit "5"? O(∩_∩)O~

                                 [Figure: examples of the handwritten digit "5"; writing styles vary from person to person]

Approach: extract feature values from the image, then use a machine learning technique to learn a model from those features.

The metric used to guide a neural network's learning is called the loss function.

Many functions can serve as the loss function; the best known is the mean squared error.

  Mean squared error formula:

    $E = \frac{1}{2}\sum_k (y_k - t_k)^2$

    $y_k$ ----------- the neural network's output

    $t_k$ ----------- the supervised data, one-hot encoded

  Here the neural network output y is the output of the softmax function. Since softmax outputs lie between 0 and 1 and sum to 1, they can be interpreted as probabilities.
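
For reference, a minimal softmax sketch (the shift by the maximum is a common numerical-stability trick assumed here, not something shown in the original post):

import numpy as np

def softmax(a):
    # subtract the max before exponentiating to avoid overflow (assumed convention)
    c = np.max(a)
    exp_a = np.exp(a - c)
    return exp_a / np.sum(exp_a)

# the outputs are non-negative and sum to 1, so they read as probabilities
print(softmax(np.array([0.3, 2.9, 4.0])))  # [0.01821127 0.24519181 0.73659691]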

  Mean squared error in Python:

import numpy as np

def mean_squared_error(y, t):
    return 0.5 * np.sum((y - t) ** 2)
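
As a quick sanity check (the arrays below are illustrative, not from the original post): with "2" as the correct label, an output that puts 0.6 on index 2 scores a much smaller error than one that puts 0.6 on a wrong index:

t  = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]   # one-hot label: "2" is correct
y1 = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]   # "2" gets 0.6
y2 = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]   # "7" gets 0.6
print(mean_squared_error(np.array(y1), np.array(t)))  # 0.0975
print(mean_squared_error(np.array(y2), np.array(t)))  # 0.5975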

In addition to the mean squared error, the cross-entropy error is also often used as a loss function.
  Cross-entropy error formula:

    $E = -\sum_k t_k \log y_k$

  Here $\log$ is the natural logarithm (base $e$).

    For example, if the correct label's index is "2" and the network's corresponding output is 0.6, the cross-entropy error is -log 0.6 ≈ 0.51; if the output corresponding to "2" is 0.1, the cross-entropy error is -log 0.1 ≈ 2.30. In other words, because t is one-hot, the cross-entropy error is determined entirely by the output at the correct label's index.

The plot below shows the natural logarithm, y = log x:

import matplotlib.pyplot as plt
import numpy as np

# generate data
x = np.arange(0.01, 1.01, 0.01)
y = np.log(x)

# draw the plot
plt.plot(x, y)
plt.xlabel('x')
plt.ylabel('y')
plt.show()

Cross-entropy error in Python:

import numpy as np

def cross_entropy_error(y, t):
    # delta guards against np.log(0), which would return -inf
    delta = 1e-7
    return -np.sum(t * np.log(y + delta))
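
Running the same illustrative arrays as before through the cross-entropy error reproduces the values from the example above:

t  = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]   # "2" is the correct label
y1 = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]
y2 = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]
print(cross_entropy_error(np.array(y1), np.array(t)))  # ≈ 0.51, i.e. -log 0.6
print(cross_entropy_error(np.array(y2), np.array(t)))  # ≈ 2.30, i.e. -log 0.1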
