Linear & Logistic Neurons

Multi-layer neural networks do not use the perceptron learning procedure.

This is because that type of guarantee cannot be extended to more complex networks, in which the average of two good solutions may be a bad solution.

 

Linear neurons (linear filters)
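
A minimal sketch of a linear neuron trained with the delta rule, assuming NumPy; the toy data, target weights, and learning rate below are made up for illustration:

```python
import numpy as np

# A linear neuron (linear filter): its output is a weighted sum of its inputs.
# Training with the delta rule minimizes the squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 input vectors, 3 features
true_w = np.array([2.0, -1.0, 0.5])
t = X @ true_w                         # targets produced by a hidden linear rule

w = np.zeros(3)                        # weights to be learned
lr = 0.05                              # learning rate

for _ in range(200):
    y = X @ w                          # neuron output: y = sum_i w_i * x_i
    error = t - y
    w += lr * X.T @ error / len(X)     # delta rule: dw_i proportional to sum_n x_i (t - y)

print(w)                               # should approach true_w
```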



The error surface for a linear neuron

Horizontal axis: the weights; vertical axis: the error.
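
As a rough illustration of that picture, the sketch below evaluates the squared error of a one-weight linear neuron over a range of weights, assuming NumPy and made-up data; printing E(w) traces out the quadratic bowl:

```python
import numpy as np

# Error surface sketch for a one-weight linear neuron: for each candidate
# weight (horizontal axis) compute the squared error over the training set
# (vertical axis).
x = np.array([1.0, 2.0, 3.0])
t = 2.0 * x                            # targets from a linear rule with w = 2

weights = np.linspace(-1.0, 5.0, 13)   # points along the horizontal axis
errors = [0.5 * np.sum((t - w * x) ** 2) for w in weights]

for w, e in zip(weights, errors):
    print(f"w = {w:5.2f}   E = {e:7.2f}")  # E(w) is a quadratic bowl with its minimum at w = 2
```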


Logistic Neuron


The logistic unit outputs y = 1 / (1 + e^(-z)), where z is the weighted sum of its inputs, and its derivative is dy/dz = y(1 - y).
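
A small check of this formula, assuming NumPy; the test points below are arbitrary, and the analytic derivative y(1 - y) is compared against a finite-difference estimate:

```python
import numpy as np

# Logistic neuron: y = 1 / (1 + exp(-z)), and dy/dz = y * (1 - y).
def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 1.5])
y = logistic(z)

analytic = y * (1.0 - y)                              # dy/dz = y(1 - y)
eps = 1e-6
numeric = (logistic(z + eps) - logistic(z - eps)) / (2 * eps)

print(analytic)
print(numeric)                                        # the two should agree closely
```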



Backpropagation algorithm
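
A minimal backpropagation sketch, assuming NumPy, one hidden layer of logistic units, a linear output, and squared error; the layer sizes, data, and learning rate are made-up choices for illustration, not a prescribed setup:

```python
import numpy as np

# Backpropagation on a tiny network: logistic hidden layer, linear output,
# trained by gradient descent on the squared error.
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 2))                # inputs
t = np.sin(X[:, :1])                        # some target to fit

W1 = rng.normal(scale=0.5, size=(2, 8))     # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))     # hidden -> output weights
b2 = np.zeros(1)
lr = 0.1

for _ in range(500):
    # forward pass
    z1 = X @ W1 + b1
    h = 1.0 / (1.0 + np.exp(-z1))           # logistic hidden units
    y = h @ W2 + b2                         # linear output

    # backward pass: propagate dE/dy back through the network
    dy = (y - t) / len(X)                   # derivative of 0.5 * mean squared error
    dW2 = h.T @ dy
    db2 = dy.sum(axis=0)
    dh = dy @ W2.T
    dz1 = dh * h * (1.0 - h)                # logistic derivative y(1 - y)
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(0.5 * np.mean((y - t) ** 2))          # error should have decreased
```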




Ways to reduce overfitting

  • weight-decay (sketched after this list, together with early-stopping)
  • weight-sharing
  • early-stopping
  • model-averaging
  • Bayesian fitting of neural nets
  • dropout
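
A rough sketch of two items from the list above, weight-decay and early-stopping, assuming NumPy and a plain linear model; the data and hyperparameters are made up for illustration:

```python
import numpy as np

# Weight decay: an L2 penalty on the weights, added to the gradient.
# Early stopping: keep the weights that did best on a held-out validation set.
rng = np.random.default_rng(2)
X = rng.normal(size=(80, 5))
t = X @ np.array([1.0, -2.0, 0.0, 0.0, 0.5]) + 0.3 * rng.normal(size=80)

X_train, t_train = X[:60], t[:60]
X_val, t_val = X[60:], t[60:]

w = np.zeros(5)
lr, decay = 0.05, 0.01                      # learning rate and weight-decay strength
best_w, best_val = w.copy(), np.inf

for epoch in range(300):
    grad = X_train.T @ (X_train @ w - t_train) / len(X_train) + decay * w
    w -= lr * grad

    val_err = np.mean((X_val @ w - t_val) ** 2)
    if val_err < best_val:                  # early stopping: remember the best weights so far
        best_val, best_w = val_err, w.copy()

print(best_val, best_w)
```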

 

The squared error as a function of the weights is not a quadratic function when we use a logistic unit.
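
One way to see this, as a sketch assuming NumPy and a single-weight logistic unit with made-up data: a quadratic function has constant second differences when sampled at equal spacing, but the second differences of E(w) below vary with w:

```python
import numpy as np

# E(w) = 0.5 * sum_n (t_n - logistic(w * x_n))^2 for a one-weight logistic unit.
# For a quadratic function the second differences would be constant; here they
# are not, so the error surface is not a quadratic bowl.
x = np.array([0.5, 1.0, 2.0])
t = np.array([0.0, 1.0, 1.0])

def E(w):
    y = 1.0 / (1.0 + np.exp(-w * x))
    return 0.5 * np.sum((t - y) ** 2)

ws = np.linspace(-4.0, 4.0, 9)              # equally spaced weights
Es = np.array([E(w) for w in ws])
second_diffs = Es[:-2] - 2 * Es[1:-1] + Es[2:]

print(second_diffs)                         # not constant, unlike a quadratic
```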

 



Reposted from blog.csdn.net/ll523587181/article/details/78841588