Probability & Statistics · Parameter estimation [moment estimation, maximum likelihood estimation, unbiasedness, efficiency, consistency]

point estimation

Assume that the form of the population's distribution function is known, but one or more of its parameters are unknown. Estimating the values of the unknown parameters from a sample of the population is called a point estimation problem.

moment estimate

This is easiest to understand by working through examples.

example


  1. First write μ₁ = E(X), μ₂ = E(X²). Write one moment equation (the expectation of one more power) for each unknown parameter; under normal circumstances an exam will go no higher than the second moment.
  2. Next, solve these equations to express the unknown parameters in terms of μ₁ and μ₂.
  3. Then replace the population moments μ₁ and μ₂ (the population 1st and 2nd moments) with the sample moments A₁ and A₂, i.e., substitute A for μ in each expression and put a hat over the resulting estimator. The first sample moment A₁ is the sample mean; A₂ − A₁² (second sample moment minus the square of the first) is the second central sample moment, the sample variance with divisor n.

If the question asks for the estimator (a statistic), replace the lowercase sample values xᵢ with the uppercase random variables Xᵢ.

When the form of the probability distribution is not given, use moment estimation.
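The three steps above can be sketched in Python. This is a minimal illustration, assuming a Uniform(a, b) population (the distribution family, the function name, and the data are my own choices for the sketch, not from the original post):

```python
import math
import random

def moment_estimate_uniform(xs):
    """Method-of-moments estimates (a_hat, b_hat) for a Uniform(a, b) sample.

    Matching mu1 = E(X) = (a+b)/2 and mu2 - mu1^2 = Var(X) = (b-a)^2/12
    with the sample moments A1 and B2 = A2 - A1^2 gives
        a_hat = A1 - sqrt(3 * B2),  b_hat = A1 + sqrt(3 * B2).
    """
    n = len(xs)
    a1 = sum(xs) / n                  # first sample moment A1 (sample mean)
    a2 = sum(x * x for x in xs) / n   # second sample moment A2
    b2 = a2 - a1 * a1                 # second central sample moment B2
    half_width = math.sqrt(3 * b2)
    return a1 - half_width, a1 + half_width

random.seed(0)
sample = [random.uniform(2.0, 5.0) for _ in range(10_000)]
a_hat, b_hat = moment_estimate_uniform(sample)  # both close to (2, 5)
```

With 10,000 draws from Uniform(2, 5), the two estimates land close to the true endpoints 2 and 5.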

maximum likelihood estimation

I won't dwell on the theory here; it is clearer to go straight to examples.
If setting the derivative (or partial derivatives) to zero fails to locate the maximum, fall back on the maximum likelihood principle directly: choose the parameter value that makes the observed sample most probable.
Discrete case: the likelihood function is the joint distribution law (joint pmf); continuous case: it is the joint probability density.

example


  1. First write the likelihood function (the joint pmf or joint density).
  2. Then take the logarithm of the likelihood function.
  3. Then differentiate, set the derivative to zero, and solve (use partial derivatives if there are several unknown parameters).
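The three steps can be sketched for a concrete case. This sketch assumes an Exp(λ) population (my own choice of example, not from the original post); the derivation in the docstring is the standard one:

```python
import random

def exponential_mle(xs):
    """MLE of the rate λ for an Exp(λ) sample, following the three steps.

    Step 1: L(λ) = λ^n · exp(-λ Σx_i)            (joint density)
    Step 2: ln L(λ) = n ln λ - λ Σx_i            (take the logarithm)
    Step 3: d(ln L)/dλ = n/λ - Σx_i = 0  →  λ_hat = n / Σx_i = 1 / x̄
    """
    return len(xs) / sum(xs)

random.seed(1)
data = [random.expovariate(2.0) for _ in range(20_000)]
lam_hat = exponential_mle(data)  # close to the true rate 2.0
```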

For some problems, differentiation cannot locate the maximum (a typical case is Uniform(0, θ), where the likelihood is monotone in θ),
so the maximum has to be found directly from the likelihood principle above.

When the form of the probability distribution is given, use maximum likelihood estimation.

unbiasedness


No matter what distribution the population follows, as long as its expectation and variance exist:
the sample mean is an unbiased estimator of the population mean, and the sample variance (with divisor n − 1) is an unbiased estimator of the population variance.
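A simulation sketch of this claim, assuming a hypothetical N(0, σ = 2) population (so σ² = 4) and sample size n = 5: averaged over many samples, the divisor-(n − 1) variance centers on σ², while the divisor-n version comes out low.

```python
import random

# Hypothetical population: N(0, σ=2), so the true variance σ² = 4.
random.seed(3)
n, trials = 5, 200_000
avg_unbiased = avg_biased = 0.0
for _ in range(trials):
    xs = [random.gauss(0.0, 2.0) for _ in range(n)]
    mean = sum(xs) / n
    ss = sum((x - mean) ** 2 for x in xs)   # Σ(x_i - x̄)²
    avg_unbiased += ss / (n - 1)            # S², divisor n-1: unbiased
    avg_biased += ss / n                    # divisor n: biased low
avg_unbiased /= trials                      # ≈ 4.0
avg_biased /= trials                        # ≈ 4 · (n-1)/n = 3.2
```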

efficiency

Of two unbiased estimators of the same parameter, the one with the smaller variance is the more efficient.

Consistency

An estimator is consistent if it converges in probability to the true parameter value as the sample size n → ∞.

practice

moment estimate

This exercise mainly applies the method above, but the expectations here are not obvious at a glance; they have to be computed using material from earlier chapters.

Estimator: uppercase letters (X₁, …, Xₙ); estimate: lowercase letters (x₁, …, xₙ).
A moment estimate is written with a hat (ˆ) over the parameter.

unbiasedness

Be familiar with the formulas for the expectation and variance of the sample mean: E(X̄) = μ and D(X̄) = σ²/n.
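Those two formulas can be checked by simulation. This sketch assumes a hypothetical N(μ = 10, σ = 3) population with sample size n = 9, so E(X̄) = 10 and D(X̄) = 9/9 = 1:

```python
import random

# Hypothetical population: N(10, σ=3); samples of size n = 9.
random.seed(4)
n, trials = 9, 100_000
means = []
for _ in range(trials):
    xs = [random.gauss(10.0, 3.0) for _ in range(n)]
    means.append(sum(xs) / n)               # one sample mean X̄
grand_mean = sum(means) / trials            # ≈ μ = 10
var_of_mean = sum((m - grand_mean) ** 2 for m in means) / trials  # ≈ σ²/n = 1
```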

discrete likelihood estimation

The sample values x₁ = 1, x₂ = 2, and so on correspond to the events X₁ = 1, X₂ = 2, …, so the likelihood function is the product of the corresponding probabilities P(X = xᵢ).
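A small sketch of the discrete case, assuming a hypothetical 0/1 (Bernoulli) sample rather than the distribution in the original exercise:

```python
def bernoulli_mle(xs):
    """Discrete MLE for a hypothetical 0/1 (Bernoulli) sample.

    With k = Σx_i successes in n trials, the likelihood is the product of
    the individual probabilities P(X = x_i):
    L(p) = p^k (1-p)^(n-k);  ln L = k ln p + (n-k) ln(1-p);
    d(ln L)/dp = k/p - (n-k)/(1-p) = 0  →  p_hat = k/n.
    """
    return sum(xs) / len(xs)

p_hat = bernoulli_mle([1, 0, 1, 1, 0, 1, 1, 0])  # k = 5, n = 8 → 0.625
```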


Origin blog.csdn.net/qq_61786525/article/details/128158700