How to understand maximum likelihood estimation intuitively

Last night I bought a can of eight-treasure porridge.

I spent a long time searching for the longan in it; generally, a can of eight-treasure porridge contains exactly one longan.

By counting how much of each ingredient ends up in this one can, we can infer the proportions of ingredients the manufacturer uses in production.

The theoretical basis for this is maximum likelihood estimation.

Roughly speaking, the idea is this:

Intuitively, maximum likelihood estimation uses the known sample results (the can of eight-treasure porridge in my hand) to infer the model parameter values (the manufacturer's ingredient proportions) that are most likely, i.e. have the maximum probability, to have produced those results.

In other words, maximum likelihood estimation provides a way to estimate a model's parameters from observed data, i.e.: "the model is given, the parameters are unknown".
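
To make this concrete, here is a minimal sketch in Python. The counts are made up for illustration (say 30 of the 100 pieces counted in the can are red beans), and a simple binomial model with an unknown proportion p is assumed. The maximum likelihood estimate is the value of p that makes the observed can most probable:

```python
import numpy as np

# Hypothetical counts from the can (made up for illustration):
# 30 of the 100 pieces counted are red beans, 70 are everything else.
red_beans, other = 30, 70

# Candidate values for the unknown parameter p = proportion of red beans.
p_grid = np.linspace(0.001, 0.999, 999)

# Binomial log-likelihood of the observed counts for each candidate p.
log_likelihood = red_beans * np.log(p_grid) + other * np.log(1 - p_grid)

# The MLE is the candidate p under which the observed can is most probable.
p_hat = p_grid[np.argmax(log_likelihood)]

print(f"MLE of red-bean proportion: {p_hat:.3f}")           # ~0.300
print(f"Empirical frequency:        {red_beans / 100:.3f}")  # 0.300
```

The grid search makes the "find the most likely parameter" step explicit; in this simple model the maximum coincides with the observed frequency 30/100, which matches the intuition of inferring the manufacturer's ratio from what is in the can.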
