[10] Properties of the maximum likelihood estimator

(Theorem 1) Suppose \(X_{(i)}\ (i=1,\dots,n)\sim N_p(\mu,\Sigma)\) with \(n>p\). Then the MLE of \((\mu,\Sigma)\) is:

\[\hat{\mu}=\overline{X}=\frac1n\sum_{i=1}^nX_{(i)}\\ \hat\Sigma=\frac1nA=\frac1n\sum_{i=1}^n(X_{(i)}-\overline{X})(X_{(i)}-\overline{X})' \]
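As a sanity check on these formulas, here is a small sketch (my own, not from the original post) that computes \(\hat{\mu}\) and \(\hat\Sigma\) with NumPy on a toy dataset:

```python
import numpy as np

# toy sample: n = 3 observations of a p = 2 dimensional vector
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
n = X.shape[0]

mu_hat = X.mean(axis=0)      # \hat{mu}: the sample mean vector
D = X - mu_hat               # centered observations X_(i) - Xbar
A = D.T @ D                  # dispersion matrix A
Sigma_hat = A / n            # MLE of Sigma: divides by n, not n - 1
```

Note the MLE divides by \(n\); the unbiased version discussed at the end of the post divides by \(n-1\).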

(Theorem 2) If \(\overline{X}\) and \(A\) are the sample mean vector and the sample dispersion matrix of a sample from a \(p\)-variate normal population \(N_p(\mu,\Sigma)\), then:

  1. \(\overline{X}\sim N_p(\mu,\frac1n\Sigma)\);
  2. \(A\stackrel{d}{=}\sum_{t=1}^{n-1}Z_tZ_t'\), where \(Z_1,\dots,Z_{n-1}\) are independent, each with distribution \(N_p(0,\Sigma)\);
  3. \(\overline{X}\) and \(A\) are independent of each other;
  4. \(P\{A>0\}=1\Leftrightarrow n>p\).

Below we also discuss when \(A\) is a positive definite matrix.

  • Some conclusions in the univariate case:

If \(X_1,\dots,X_n\) are i.i.d. \(N(\mu,\sigma^2)\), then:

  1. \(\overline{X}\sim N(\mu,\frac{\sigma^2}{n})\);
  2. \(\frac{\sum_{i=1}^n(X_i-\overline{X})^2}{\sigma^2}\sim\chi^2(n-1)\);
  3. \(\overline{X}\) and \(s^2=\frac1{n-1}\sum_{i=1}^n(X_i-\overline{X})^2\) are independent.
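A quick numerical illustration of \(\overline{X}\) and \(s^2\) (my own sketch, not part of the proof below):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = x.size

xbar = x.mean()                   # sample mean
ss = np.sum((x - xbar) ** 2)      # sum of squared deviations
s2 = ss / (n - 1)                 # s^2 with the n-1 denominator
```

Here `np.var(x, ddof=1)` computes the same \(s^2\), with `ddof=1` selecting the \(n-1\) denominator.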

Proof:

Let \(\Gamma\) be an \(n\times n\) orthogonal matrix (distinct rows have inner product 0, and each row has squared length 1) whose last row is \((\frac1{\sqrt{n}},\dots,\frac1{\sqrt{n}})\), of the following form:

\[\Gamma= \left[ \begin{array}{ccc} \gamma_{11}&\dots&\gamma_{1n}\\ \vdots&&\vdots\\ \gamma_{(n-1),1}&\dots&\gamma_{(n-1),n}\\ \frac1{\sqrt{n}}&\dots&\frac1{\sqrt{n}} \end{array} \right]=(\gamma_{ij})_{n\times n} \]
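One concrete choice of such a \(\Gamma\) is the Helmert matrix (my construction; the post does not spell out the first \(n-1\) rows). The sketch below builds it and checks the orthogonality and row-sum properties used later in the proof:

```python
import numpy as np

def helmert(n):
    """n x n orthogonal matrix whose last row is (1/sqrt(n), ..., 1/sqrt(n))."""
    G = np.zeros((n, n))
    for t in range(1, n):                        # rows t = 1, ..., n-1
        G[t - 1, :t] = 1.0 / np.sqrt(t * (t + 1))
        G[t - 1, t] = -t / np.sqrt(t * (t + 1))
    G[-1, :] = 1.0 / np.sqrt(n)                  # last row: all 1/sqrt(n)
    return G

G = helmert(5)
```

Each of the first \(n-1\) rows has entries summing to zero (they are orthogonal to the constant last row), which is exactly what makes \(E(Z_t)=0\) for \(t\neq n\) below.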

Apply the linear transformation \(Z=\Gamma X\) to \(X\), that is:

\[Z=\left[ \begin{array}{c} Z_1'\\ \vdots\\ Z_n' \end{array} \right]=\Gamma \left[ \begin{array}{c} X_{(1)}'\\ \vdots\\ X_{(n)}' \end{array} \right]=\Gamma X \]

then:

\[\begin{align} Z'=&X'\Gamma'\\\\ (Z_1,...,Z_n)=&(X_{(1)},...,X_{(n)})\Gamma'\\ Z_t=(X_{(1)},...,X_{(n)})&\left[ \begin{array}{c} \gamma_{t1}\\ \vdots\\ \gamma_{tn} \end{array} \right],(t=1,...,n) \end{align} \]

Each \(Z_t\) is a \(p\)-dimensional random vector: it is a linear combination of the \(p\)-dimensional normal random vectors \(X_{(1)},\dots,X_{(n)}\), so \(Z_t\) is itself a \(p\)-dimensional normal random vector. Moreover:

\[E(Z_t)=E\left(\sum_{i=1}^n\gamma_{ti}X_{(i)}\right)=\sum_{i=1}^n\gamma_{ti}E(X_{(i)})=\mu\sum_{i=1}^n\gamma_{ti}= \begin{cases} \mu\cdot\left(\frac1{\sqrt{n}}\sum_{i=1}^n\gamma_{ti}\right)\cdot\sqrt{n}=0&\text{when }t\neq n,\\ \mu\cdot n\cdot\frac1{\sqrt{n}}=\sqrt{n}\,\mu&\text{when }t=n, \end{cases} \]

since for \(t\neq n\) the quantity \(\frac1{\sqrt{n}}\sum_{i=1}^n\gamma_{ti}\) is the inner product of row \(t\) with the last row, which is 0 by orthogonality.

\[\mathrm{Cov}(Z_{\alpha},Z_{\beta})=E[(Z_{\alpha}-E(Z_\alpha))(Z_{\beta}-E(Z_{\beta}))']=\sum_{i=1}^n\gamma_{\alpha i}\gamma_{\beta i}\,\Sigma= \begin{cases} O&\alpha\neq\beta,\\ \Sigma&\alpha=\beta. \end{cases} \]

  • \(\overline{X}\sim N_p(\mu,\frac1n\Sigma)\);

\(Z_n=\frac1{\sqrt{n}}\sum_{\alpha=1}^nX_{(\alpha)}=\sqrt{n}\,\overline{X}\sim N_p(\sqrt{n}\,\mu,\Sigma)\), hence \(\overline{X}=\frac1{\sqrt{n}}Z_n\sim N_p(\mu,\frac1n\Sigma)\).

  • \(A\stackrel{d}{=}\sum_{t=1}^{n-1}Z_tZ_t'\), where \(Z_1,\dots,Z_{n-1}\) are independent, each with distribution \(N_p(0,\Sigma)\);

\(\sum_{\alpha=1}^nZ_\alpha Z_\alpha'=(Z_1,\dots,Z_n)\left(\begin{array}{c}Z_1'\\\vdots\\Z_{n}'\end{array}\right)=Z'Z=X'\Gamma'\Gamma X=X'X=\sum_{\alpha=1}^nX_{(\alpha)}X_{(\alpha)}'\)

then:

\[\sum_{\alpha=1}^{n-1}Z_\alpha Z_\alpha'=\sum_{\alpha=1}^nX_{(\alpha)}X_{(\alpha)}'-Z_{n}Z_{n}'=\sum_{\alpha=1}^nX_{(\alpha)}X_{(\alpha)}'-n\overline{X}\,\overline{X}'=A \]
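The identity \(\sum_{\alpha=1}^{n-1}Z_\alpha Z_\alpha'=A\) can be checked numerically. This sketch (my own, using a Helmert-type \(\Gamma\) as one admissible choice) transforms a fixed sample and compares both sides:

```python
import numpy as np

def helmert(n):
    """n x n orthogonal matrix with last row (1/sqrt(n), ..., 1/sqrt(n))."""
    G = np.zeros((n, n))
    for t in range(1, n):
        G[t - 1, :t] = 1.0 / np.sqrt(t * (t + 1))
        G[t - 1, t] = -t / np.sqrt(t * (t + 1))
    G[-1, :] = 1.0 / np.sqrt(n)
    return G

rng = np.random.default_rng(42)
n, p = 6, 3
X = rng.standard_normal((n, p))    # row i plays the role of X_(i)'

Z = helmert(n) @ X                 # row t of Z is Z_t'
Xbar = X.mean(axis=0)
A = (X - Xbar).T @ (X - Xbar)      # dispersion matrix

lhs = Z[:-1].T @ Z[:-1]            # sum over t = 1, ..., n-1 of Z_t Z_t'
```

The test also checks \(Z_n=\sqrt{n}\,\overline{X}\), the other identity used in the proof.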

  • \(\overline{X}\) and \(A\) are independent of each other;

\[A=g(\sum_{t=1}^{n-1}Z_tZ_t')\\ \overline{X}=f(Z_n) \]

Since \(Z_i\) and \(Z_j\) are independent of each other for \(i\neq j\), \(A\) and \(\overline{X}\) are also independent of each other.

  • \(P\{A>0\}=1\Leftrightarrow n>p\).

Let \(B=(Z_1,\dots,Z_{n-1})\), a \(p\times(n-1)\) matrix, and write \(A=BB'\). Clearly \(\mathrm{rank}(A)=\mathrm{rank}(B)\). When \(A\) is a positive definite matrix, \(\mathrm{rank}(A)=p\), so \(\mathrm{rank}(B)=p\), which requires \(n-1\geq p\), i.e. \(n>p\).
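To illustrate the rank argument (a sketch of my own, with arbitrarily chosen dimensions): when \(n-1<p\), the matrix \(A=BB'\) has rank at most \(n-1\), so it is singular and cannot be positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 3                           # n - 1 = 2 < p = 5
B = rng.standard_normal((p, n - 1))   # B = (Z_1, ..., Z_{n-1}), p x (n-1)
A = B @ B.T                           # p x p, but rank(A) = rank(B) <= n - 1

rank_A = np.linalg.matrix_rank(A)
```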

(Unbiasedness)

\[E(\overline{X})=\left(\begin{array}{c} \frac1n\sum_{i=1}^nE(x_{i1})\\ \vdots\\ \frac1n\sum_{i=1}^nE(x_{ip})\\ \end{array}\right)= \left(\begin{array}{c} \mu_1\\\vdots\\\mu_p \end{array}\right)=\mu \]

\[E(A)=E\left(\sum_{i=1}^{n-1}Z_iZ_i'\right)=\sum_{i=1}^{n-1}E(Z_iZ_i')=\sum_{i=1}^{n-1}D(Z_i)=(n-1)\Sigma\neq n\Sigma \]

Therefore \(E(\hat\Sigma)=\frac{n-1}{n}\Sigma\), so \(\hat\Sigma=\frac1nA\) is not an unbiased estimator of \(\Sigma\); it should be corrected to \(S=\frac1{n-1}A\).
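A Monte Carlo sketch of the bias (my own; the sample size, dimension, and repetition count are chosen arbitrarily): averaging over many samples, \(S=\frac1{n-1}A\) centers on \(\Sigma\), while \(\hat\Sigma=\frac1nA\) centers on \(\frac{n-1}{n}\Sigma\).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 5, 2, 20000
Sigma = np.eye(p)                         # true covariance (identity)

S_sum = np.zeros((p, p))
hat_sum = np.zeros((p, p))
for _ in range(reps):
    X = rng.standard_normal((n, p))       # rows i.i.d. N_p(0, I)
    Xbar = X.mean(axis=0)
    A = (X - Xbar).T @ (X - Xbar)
    S_sum += A / (n - 1)                  # unbiased estimator S
    hat_sum += A / n                      # MLE (biased)

S_mean = S_sum / reps                     # approximates Sigma
hat_mean = hat_sum / reps                 # approximates (n-1)/n * Sigma
```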


Origin www.cnblogs.com/rrrrraulista/p/12546064.html