The difference and connection between loss function, cost function, and objective function in machine learning

To assess whether a model fits the data well or badly, we usually use a loss function (strictly speaking, the objective function described below) to measure the goodness of fit. Minimizing the loss function means achieving the best fit, and the parameters at that minimum are the optimal model parameters.

Every algorithm has an objective function, and the algorithm works by making that objective function optimal. For a classification algorithm, predictions can be right or wrong, and a wrong prediction comes at a price, i.e. a loss. The goal of a classification algorithm is to make as few mistakes as possible, that is, to keep this cost to a minimum; a small sketch of counting that cost follows.
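As a minimal illustration of the price of wrong predictions, the sketch below uses the simple 0-1 loss, where each misclassified sample costs 1 (this particular loss is an assumed example for illustration, not something the post specifies):

```python
def misclassification_cost(predicted_labels, true_labels):
    """0-1 loss: each wrong prediction costs 1, each correct one costs 0.
    The classifier's goal is to keep this total as small as possible."""
    return sum(1 for p, t in zip(predicted_labels, true_labels) if p != t)

# Two of the four predictions are wrong, so the total cost is 2.
print(misclassification_cost([1, 0, 1, 1], [1, 1, 1, 0]))
```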

The loss function, also called the error function, measures the error between the predicted value and the true value, and so quantifies how well the algorithm is doing. The loss function only measures the algorithm's performance on a single training sample. It is mainly used with back-propagation: for back-propagation to find the minimum, the loss function must be differentiable.
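For concreteness, here is a small sketch of a per-sample loss, using the squared error as an assumed example; since it is differentiable, its closed-form derivative is exactly what back-propagation would use:

```python
def squared_error_loss(y_hat, y):
    """Loss for one training sample: half the squared error between
    the prediction y_hat and the true value y."""
    return 0.5 * (y_hat - y) ** 2

def squared_error_grad(y_hat, y):
    """Derivative of the loss w.r.t. the prediction; because the loss is
    differentiable, back-propagation can use this to reduce the error."""
    return y_hat - y

# One sample: prediction 2.5, true value 3.0
print(squared_error_loss(2.5, 3.0))  # 0.125
print(squared_error_grad(2.5, 3.0))  # -0.5
```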

But we also need to measure the algorithm's performance on all the training samples, so we define a cost function: the loss function summed over the m training samples and then divided by m.
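A minimal sketch of that definition, again assuming the squared-error loss from above, averages the per-sample losses over the m training samples:

```python
import numpy as np

def cost(y_hat, y):
    """Cost function: the per-sample losses (here, half the squared error)
    summed over all m samples and divided by m."""
    m = len(y)
    per_sample_loss = 0.5 * (np.asarray(y_hat) - np.asarray(y)) ** 2
    return per_sample_loss.sum() / m

predictions = [2.5, 0.0, 2.1]
targets = [3.0, -0.5, 2.0]
print(cost(predictions, targets))  # mean of the three per-sample losses
```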

The cost function plus a regularization term (or similar terms) is called the optimization objective function.
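As an illustration, the sketch below adds an L2 (weight-decay) penalty to the cost; both the choice of L2 and the strength lam are assumptions made for this example, not the only possible regularizer:

```python
import numpy as np

def objective(y_hat, y, weights, lam=0.1):
    """Objective function: the cost (mean per-sample loss) plus an
    L2 regularization term on the model weights."""
    m = len(y)
    cost = 0.5 * np.sum((np.asarray(y_hat) - np.asarray(y)) ** 2) / m
    l2_penalty = lam * np.sum(np.asarray(weights) ** 2)
    return cost + l2_penalty

print(objective([2.5, 0.0], [3.0, -0.5], weights=[0.3, -1.2]))
```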


Source: www.cnblogs.com/dyl222/p/11019996.html