Google Machine Learning Notes (Part 2)

Regularization for Simplicity

Regularization: reduce the complexity of the model in order to reduce overfitting.

First, understand what regularization means: do not place undue trust in the training samples. The training set carries its own noise and idiosyncrasies. (By analogy with language learning: if you fully imitate one person's way of speaking, you will also pick up their unintended verbal tics.)
Regularization methods:
1. Early stopping (difficult; the right stopping point is hard to judge)
2. Penalizing model complexity (keeping the model from becoming needlessly complicated), e.g. with L2 regularization
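Early stopping can be sketched in a few lines. This is a minimal illustration, not from the original notes: it assumes we already have a list of per-step validation losses and a hypothetical `patience` parameter (stop after that many consecutive non-improving steps).

```python
# Minimal early-stopping sketch (illustrative; the loss history below is
# synthetic, shaped to fall and then rise as overfitting sets in).
def train_with_early_stopping(val_losses, patience=2):
    """Return the step index at which training should stop.

    val_losses: validation loss observed after each training step.
    patience:   consecutive non-improving steps tolerated before stopping.
    """
    best = float("inf")
    bad_steps = 0
    for step, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            bad_steps = 0
        else:
            bad_steps += 1
            if bad_steps >= patience:
                return step  # validation loss stopped improving; stop here
    return len(val_losses) - 1  # never triggered; trained to the end

# Validation loss falls, then rises (overfitting begins around step 3).
history = [1.0, 0.6, 0.4, 0.45, 0.5, 0.7]
stop_at = train_with_early_stopping(history, patience=2)
```

The difficulty the notes mention is visible here: the result depends on `patience`, and a noisy validation curve can trigger a stop too early or too late.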
Generalization curve
L2 regularization specifically:
· complexity(model) = the sum of the squares of the weights
· penalizes very large weights
· for linear models, prefers flatter slopes
· Bayesian prior: weights should be centered on zero and normally distributed
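The complexity term above can be computed directly. A short sketch (the weight values are illustrative, not from the original notes):

```python
# Sketch of the L2 complexity term: the sum of the squares of all weights.
def l2_complexity(weights):
    return sum(w * w for w in weights)

# Illustrative weights: one weight (5.0) is much larger than the rest.
weights = [0.2, 0.5, 5.0, 1.0, 0.25, 0.75]
complexity = l2_complexity(weights)
# The single large weight dominates: 5.0**2 = 25 of the ~26.9 total,
# which is why L2 regularization pushes hardest against very large weights.
```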
The training objective becomes: minimize(Loss(Data|Model) + lambda * complexity(Model))
· L: the loss term, which aims to reduce training error
· lambda: the regularization rate, a scalar that controls how the loss term and the model-complexity term are balanced
· the L2 regularization term itself is the sum of the squared weights: ||w||^2 = w1^2 + w2^2 + ... + wn^2

lambda * regularization term: lambda scales the overall impact of the regularization term.
A high lambda value yields a simpler model.
A lower lambda value yields a more complex model; lambda = 0 cancels regularization entirely, so the sole goal is minimizing training loss, which carries the highest risk of overfitting.
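The lambda trade-off can be seen on a toy one-weight problem. This is a hedged sketch with an assumed objective, not anything from the notes: data loss (w - 2)^2 plus the L2 penalty lambda * w^2, whose minimizer has the closed form w* = 2 / (1 + lambda).

```python
# Toy objective: (w - target)^2 + lam * w^2.
# Setting the derivative to zero:
#   2*(w - target) + 2*lam*w = 0  =>  w* = target / (1 + lam)
def best_weight(lam, target=2.0):
    return target / (1.0 + lam)

# Larger lambda pulls the weight toward 0 (a simpler model);
# lambda = 0 fits the training data exactly.
for lam in (0.0, 0.5, 1.0, 10.0):
    print(lam, best_weight(lam))
```

The pattern matches the notes: lambda = 0 gives the pure data fit (w = 2, highest overfitting risk), and increasing lambda shrinks the weight toward zero.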

When regularization applies:
When you cannot get more training data, or the training and test data differ, you may need cross-validation or a separately adjusted test set.
Goal: structural risk minimization.



Source: blog.csdn.net/DUTwangtaiyu/article/details/104669954