Likelihood, the likelihood function, and maximum likelihood estimation

1.1 What is likelihood?

In statistics, likelihood and probability are two different things.
 Probability: given a known model, the chance that a particular event occurs.
 Likelihood: given a series of observed events from an unknown model, a measure of how well candidate parameter values explain those events, used to estimate the model's unknown parameters.
 A simple example may make this more concrete. Suppose I have a fair coin and toss it into the air; what is the chance it lands heads up? No doubt about it: because the coin is fair, heads and tails are equally likely, so 50%. Likelihood enters the picture when the coin I am given may not be fair: it could be biased toward heads, or toward tails. I toss this coin a thousand times and find 700 heads and 300 tails. I cannot tell for certain whether the coin is fair or not, but based on these tosses I estimate the probability of a single toss landing heads (the parameter) to be 0.7 (for a fair coin the parameter is 0.5).
In summary: probability describes the possible outcomes under a known model, while likelihood describes how plausible a model is given results that have already occurred.
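The thousand-toss example can be checked in a couple of lines of Python (the variable names here are my own; for a Bernoulli coin, the maximum-likelihood estimate turns out to be just the observed frequency):

```python
# Counts from the example in the text: 700 heads out of 1000 tosses.
heads, tosses = 700, 1000

# For a coin modeled as a Bernoulli trial, the maximum-likelihood
# estimate of the heads probability is the observed frequency.
theta_hat = heads / tosses
print(theta_hat)  # 0.7
```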

1.2 What is the likelihood function?

  Probability density function: P ( x | \theta )
 Likelihood function: L ( \theta | x )
  When x and \theta correspond with each other, the two functions take equal values, but the meanings they express are completely different. The probability density function treats the parameter \theta as given and describes the possibility of the outcome x occurring; the likelihood function treats the observed result x as given and describes how plausible each value of the parameter \theta is.
  Take the coin example again. I toss the coin five times and obtain the exact sequence heads, tails, heads, heads, tails. Now suppose the probability that this coin lands heads is \theta (i.e., \theta is the coin's unknown parameter). The possibility of observing this sequence is \theta (1-\theta) \theta \theta (1-\theta), and that value, viewed as a function of \theta, is the value of the likelihood function.
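The likelihood of an exact toss sequence can be computed directly. This is a minimal Python sketch; the function name and the 1/0 encoding for heads/tails are my own choices:

```python
def likelihood(theta, outcomes):
    """Probability of an exact sequence of coin tosses,
    where theta is the heads probability and outcomes is a
    list of 1 (heads) and 0 (tails)."""
    p = 1.0
    for o in outcomes:
        p *= theta if o == 1 else (1 - theta)
    return p

# The sequence from the text: heads, tails, heads, heads, tails.
seq = [1, 0, 1, 1, 0]
print(likelihood(0.5, seq))  # 0.03125, i.e. 0.5**5 for a fair coin
```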

1.3 From the likelihood function to maximum likelihood estimation

  Continuing the coin example above: in theory, to learn the coin's unknown parameter I would have to toss it countless times, but my time is limited, so I want to estimate the parameter from just five tosses. From the observed sequence, the likelihood function is L ( \theta ) = \theta ^ { 3 } ( 1 - \theta ) ^ { 2 }, which takes different values for different \theta. When \theta = 0.2, the likelihood is 0.00512; when \theta = 0.3, it is 0.01323. The latter is clearly larger, so a coin with parameter 0.3 makes the results of our five tosses more probable than one with parameter 0.2, and 0.3 is the more credible estimate.
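The two likelihood values quoted in the text can be verified numerically (a small Python sketch; `L` is my name for the likelihood function):

```python
def L(theta):
    # Likelihood of three heads and two tails in the observed order.
    return theta**3 * (1 - theta)**2

print(round(L(0.2), 5))  # 0.00512
print(round(L(0.3), 5))  # 0.01323
```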

 There is clearly a value \theta = 0.6 at which the function attains its maximum: if the coin's heads probability is estimated to be 0.6, the observed sequence heads, tails, heads, heads, tails is as probable as it can be. So the best estimate of the coin's parameter that we can make from these five tosses is 0.6. Of course, an estimate derived from only five tosses is hard to find convincing; an estimate derived from, say, fifty thousand tosses would be much closer to the true parameter.
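That the maximum sits at 0.6 can be confirmed with a simple grid search over candidate parameter values (a brute-force sketch in Python; the analytic route, setting the derivative of \theta^3(1-\theta)^2 to zero, gives \theta = 3/5 exactly):

```python
# Evaluate L(theta) = theta**3 * (1 - theta)**2 on a fine grid
# and keep the theta with the largest likelihood.
thetas = [i / 1000 for i in range(1001)]
best = max(thetas, key=lambda t: t**3 * (1 - t)**2)
print(best)  # 0.6
```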
  Maximum likelihood estimation, in short, uses the data you have on hand to make the most credible estimate of the model's parameters: the values under which the observed results are most probable.

1.4 General routine for maximum likelihood estimation:

1. Write down the likelihood function of the observed results according to the probability distribution.
2. Take the logarithm of the likelihood function (products become sums).
3. Take the derivative and set the first derivative to zero.
4. Solve the resulting equation.
5. Draw the conclusion.
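The five steps above can be walked through for the coin example, three heads in five tosses. The derivation is worked in the comments and the final formula checked in code (a sketch under my own names `mle` and `log_likelihood`; standard library only):

```python
import math

# For n Bernoulli tosses with k heads (here k=3, n=5):
# 1. Likelihood:      L(theta) = theta**k * (1 - theta)**(n - k)
# 2. Log-likelihood:  l(theta) = k*log(theta) + (n - k)*log(1 - theta)
# 3. Derivative:      l'(theta) = k/theta - (n - k)/(1 - theta)
# 4. Set l' = 0:      k*(1 - theta) = (n - k)*theta  =>  theta = k/n
# 5. Conclusion: the MLE is the observed frequency of heads.

def mle(k, n):
    # Closed-form solution from step 4.
    return k / n

def log_likelihood(theta, k, n):
    # Step 2, used to sanity-check that the MLE scores highest.
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

k, n = 3, 5
theta_hat = mle(k, n)
print(theta_hat)  # 0.6
# The log-likelihood at the MLE beats a competing value such as 0.5:
print(log_likelihood(theta_hat, k, n) >= log_likelihood(0.5, k, n))  # True
```

Because the logarithm is monotone, maximizing the log-likelihood in step 2 gives the same answer as maximizing the likelihood itself, which is why the trick of turning products into sums is safe.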

Origin blog.csdn.net/weixin_43770577/article/details/104064438