[1] Observing the Phenomena of Life Through the Law of Large Numbers and the Central Limit Theorem

This article was finished some time ago; along the way I wrote about ten short pieces of "obscure science". Rereading them now, I cannot help but reflect: apart from discovering that mathematics can be dry, did my readers gain anything else? A book I once read said: do not fear not knowing; fear not knowing that you do not know. In my own practice, and in exchanging technical and life experience with others, I have felt the truth of this sentence deeply.

So I set out to rebuild my articles. In my own eyes they have plenty of bright spots: for example, they deconstruct knowledge level by level, explaining a complicated, deep theorem from the shallow end inward, and the proofs have been scrutinized repeatedly so that they read precisely and fluently. Yet readers seem indifferent. Explaining a theorem in plain, chatty language appears to be what readers love; after all, in an era of fragmented information, not everyone has time to ponder such intangible things.

Writing forces an author to organize his thoughts and articulate them, so someone good at writing must be good at thinking. Readers need not be, because reading does not demand any critical thinking: if a reader never pauses to weigh the credibility of each sentence, he may end up blindly echoing someone else's empty talk. To some extent such a reader is a victim; his time and mental capacity provide living space for thinking that is pointless and soon fades away.

Still, I hope the few readers I have here can think along with me, gain something, and immerse themselves together in the world of logic.

2020/2/22: rewrote this article, which I had dashed off carelessly at the time.

When we study mathematical analysis, we encounter the concept of an infinite series. Evaluating a finite partial sum can be very hard, yet if we let the number of terms go to infinity, the problem often simplifies. For example:
\[ \lim_{n\to\infty}\left(1+x+\frac{x^2}{2!}+\cdots+\frac{x^n}{n!}\right)=\lim_{n\to\infty}\sum_{i=0}^{n}\frac{x^i}{i!}=e^x \]
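For instance, a few lines of Python (a sketch of mine, not from the original post; the value \(x=2\) is an arbitrary assumption) show the partial sums closing in on \(e^x\):

```python
import math

# Sketch: the partial sums 1 + x + x^2/2! + ... + x^n/n! close in on e^x
# as n grows; x = 2.0 is an arbitrary choice for illustration.
x = 2.0
partial, term = 0.0, 1.0  # term holds x^i / i!, starting at i = 0
for i in range(20):
    partial += term
    term *= x / (i + 1)
    if i in (2, 5, 10, 19):
        print(f"n = {i:>2}: partial sum = {partial:.10f}")
print(f"e^x        = {math.exp(x):.10f}")
```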
The world of probability and statistics exhibits a similar phenomenon. Expressed in mathematical language, we can abstract two kinds of questions:

  • As the number of trials increases, does the frequency of an event converge to its probability?
  • If we sum up several random variables, what limiting distribution does the sum obey?

Below, step by step and at an unhurried pace, we work out the logic behind these two statements.

1. Does frequency converge to probability?

The law of large numbers concerns the following setup. For an event \(A\), we run \(n\) independent trials and observe in each trial whether \(A\) occurs. To record this, we define random variables \(X_i\ (i=1,2,\dots,n)\), where \(X_i=1\) if \(A\) occurs in the \(i\)-th trial and \(X_i=0\) otherwise. Over these \(n\) trials, the event then occurs a total of \(X_1+X_2+\dots+X_n\) times.

If we take as given the familiar idea that "frequency approximates probability", we can write the following formula:
\[ P(A)=\lim_{n\to\infty}p_n=\lim_{n\to\infty}\frac{X_1+X_2+\dots+X_n}{n}=\lim_{n\to\infty}\overline{X}_n \]
To put it in everyday terms, imagine estimating a region's average income. If we survey just one person, his income may be far from the true average; but once we survey ten thousand people and take the arithmetic mean of their incomes, the deviation of that mean from the overall average income will be much smaller. The law of large numbers takes this rule of thumb from daily life and generalizes and proves it at the theoretical level.
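To see this convergence concretely, here is a small simulation sketch of mine (the event probability \(p=0.3\) and the trial counts are assumptions for illustration): the running frequency of a simulated event drifts toward its probability as \(n\) grows.

```python
import random

# Sketch: simulate n independent trials of an event A with assumed probability
# p = 0.3, and watch the running frequency approach p.
p = 0.3
random.seed(42)

occurrences = 0
for n in range(1, 100_001):
    occurrences += random.random() < p  # X_i = 1 if A occurs on trial i, else 0
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n = {n:>6}: frequency = {occurrences / n:.4f}")
```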

Bernoulli's law of large numbers

  • Let \(X_1, X_2, \dots, X_n, \dots\) be independent and identically distributed random variables, write their common mean as \(a\), and suppose their variance exists, denoted \(\sigma^2\). Then for any given \(\varepsilon>0\),

\[ \lim_{n\to\infty}P\left(\left|\frac{1}{n}\sum_{i=1}^{n}X_i-a\right|\ge\varepsilon\right)=0 \quad \text{(Bernoulli's law of large numbers)} \]

Called " \ (\ overline {} X-_n \) converges in probability a", to prove this theorem we need to introduce Markov probability inequality :

(Markov's inequality)

  • If \(Y\) is a non-negative random variable, then for any given constant \(\varepsilon>0\),

\[ P(Y\ge\varepsilon)\le\frac{E(Y)}{\varepsilon} \quad \text{(Markov's inequality)} \]

Proof (for the continuous case, where \(Y\) has density \(f\)):
\[ E(Y)=\int_0^\infty y f(y)\,dy \ \ge\ \int_\varepsilon^\infty y f(y)\,dy \ \ge\ \varepsilon\int_\varepsilon^\infty f(y)\,dy \ =\ \varepsilon P(Y\ge\varepsilon) \]
Dividing both sides by \(\varepsilon\) gives the inequality.
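As a quick numerical sanity check (my addition, not part of the original proof), we can compare both sides of Markov's inequality for an exponential variable with mean 1:

```python
import random

# Sketch: verify P(Y >= eps) <= E(Y) / eps empirically for Y ~ Exponential(1),
# which is non-negative with E(Y) = 1.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]

for eps in (0.5, 1.0, 2.0, 4.0):
    lhs = sum(y >= eps for y in samples) / len(samples)  # empirical P(Y >= eps)
    rhs = 1.0 / eps                                      # bound E(Y) / eps
    print(f"eps = {eps}: P(Y >= eps) ~ {lhs:.4f} <= bound {rhs:.4f}")
```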

(Chebyshev's inequality)

  • If \(Var(Y)\) exists, then:

\[ P(|Y-E(Y)|\ge\varepsilon)\le\frac{Var(Y)}{\varepsilon^2} \quad \text{(Chebyshev's inequality)} \]

To prove it, apply Markov's inequality:
\[ P(|Y-E(Y)|\ge\varepsilon)=P\left([Y-E(Y)]^2\ge\varepsilon^2\right)\le\frac{E\left([Y-E(Y)]^2\right)}{\varepsilon^2}=\frac{Var(Y)}{\varepsilon^2} \]
Here we used the fact that \(P(|Z|\ge a)=P(Z^2\ge a^2)\) for \(a>0\), since squaring preserves order on non-negative values.

  • Notice that Chebyshev's inequality is really just a special case of Markov's inequality ;)
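The same kind of numerical check works here; this is a sketch of mine using \(Y\sim\mathrm{Uniform}(0,1)\), for which \(E(Y)=1/2\) and \(Var(Y)=1/12\):

```python
import random

# Sketch: verify P(|Y - E(Y)| >= eps) <= Var(Y) / eps^2 for Y ~ Uniform(0, 1),
# where E(Y) = 1/2 and Var(Y) = 1/12.
random.seed(0)
samples = [random.random() for _ in range(100_000)]
mean, var = 0.5, 1.0 / 12.0

for eps in (0.1, 0.2, 0.4):
    lhs = sum(abs(y - mean) >= eps for y in samples) / len(samples)
    rhs = var / eps ** 2
    print(f"eps = {eps}: P(|Y - E(Y)| >= eps) ~ {lhs:.4f} <= bound {rhs:.4f}")
```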

Writing \(\overline{X}_n=\frac{1}{n}\sum_{i=1}^{n}X_i\), we can now prove the theorem:
\[ \begin{align} \lim_{n\to\infty}P\left(\left|\frac{1}{n}\sum_{i=1}^{n}X_i-a\right|\ge\varepsilon\right) &=\lim_{n\to\infty}P\left(\left|\overline{X}_n-a\right|\ge\varepsilon\right)\\ \text{(Chebyshev)}\quad &\le\lim_{n\to\infty}\frac{Var(\overline{X}_n)}{\varepsilon^2}\\ \text{(i.i.d.: variance of the sum is the sum of variances)}\quad &=\lim_{n\to\infty}\frac{1}{n^2\varepsilon^2}\sum_{i=1}^{n}Var(X_i)\\ &=\lim_{n\to\infty}\frac{n\sigma^2}{n^2\varepsilon^2}\\ &=\lim_{n\to\infty}\frac{\sigma^2}{n\varepsilon^2}=0 \end{align} \]

  • Specialized to indicator variables, this is the first law of large numbers in history, Bernoulli's law of large numbers (1713), which is exactly what we usually mean by "frequency converges in probability":

\[ \lim_{n\to\infty}P(|p_n-p|\geq\varepsilon)=0 \]
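A Monte Carlo sketch of mine (\(\varepsilon=0.05\) and the sample sizes are assumed for illustration) shows this probability actually shrinking; each \(X_i\) below is a fair coin flip, so \(p=0.5\):

```python
import random

# Sketch: estimate P(|p_n - p| >= eps) for the frequency p_n of heads in n fair
# coin flips (p = 0.5), and watch the estimate fall toward 0 as n grows.
random.seed(1)
eps, trials = 0.05, 1_000

for n in (10, 100, 1_000, 10_000):
    exceed = 0
    for _ in range(trials):
        freq = sum(random.random() < 0.5 for _ in range(n)) / n
        exceed += abs(freq - 0.5) >= eps
    print(f"n = {n:>5}: P(|p_n - 0.5| >= {eps}) ~ {exceed / trials:.3f}")
```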

In everyday life, "theorem", "law", and "rule" seem to mean the same thing, but in science a "theorem" generally denotes a theoretical fact that can be proved with rigorous mathematical tools, while a "law" is a regularity that is "obvious" from large amounts of observation yet hard to prove theoretically.

2. What limiting distribution does a sum obey?

The law of large numbers discusses under what conditions the arithmetic mean of a sequence of random variables converges in probability to the common mean; the central limit theorem discusses under what conditions the sum of independent random variables
\[ Y_n=\sum_{i=1}^{n}X_i \]
has a distribution function that converges to a normal distribution.

Convolution formula

If \(Z=X+Y\) with \(X\) and \(Y\) independent, the density of \(Z\) is the convolution of their densities:
\[ \begin{align} p_Z(z)=(p_X*p_Y)(z) =&\int_{-\infty}^\infty p_X(z-y)p_Y(y)dy\\ =&\int_{-\infty}^\infty p_X(x)p_Y(z-x)dx \end{align} \]
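As a concrete illustration (mine, under the assumption \(X,Y\sim\mathrm{Uniform}(0,1)\)), discretizing the convolution integral recovers the triangular density of \(Z=X+Y\) on \([0,2]\):

```python
# Sketch: approximate p_Z(z) = integral of p_X(z - y) p_Y(y) dy by a Riemann
# sum for X, Y ~ Uniform(0, 1); the exact answer is the triangular density.
dy = 0.001

def p_uniform(t):
    # density of Uniform(0, 1)
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

for z in (0.25, 0.5, 1.0, 1.5, 1.75):
    pz = sum(p_uniform(z - k * dy) * p_uniform(k * dy) * dy for k in range(1001))
    exact = z if z <= 1.0 else 2.0 - z
    print(f"p_Z({z}) ~ {pz:.3f}   (exact: {exact:.3f})")
```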

(Lindeberg-Lévy) Central Limit Theorem

Let \(X_1, X_2, \dots, X_n, \dots\) be a sequence of independent and identically distributed random variables with \(E(X_i)=a\) and \(Var(X_i)=\sigma^2\ (0<\sigma<\infty)\). Then for any real number \(x\):
\[ \lim_{n\to\infty}P\left(\frac{1}{\sqrt{n}\,\sigma}\left(\sum_{i=1}^{n}X_i-na\right)\le x\right)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{x}e^{-t^2/2}\,dt=\Phi(x) \]
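To watch the theorem at work, here is a simulation sketch of mine: the summands are \(\mathrm{Uniform}(0,1)\) variables (an assumed choice, giving \(a=1/2\) and \(\sigma^2=1/12\)), and the empirical CDF of the standardized sum is compared with \(\Phi(x)\) at a few points.

```python
import math
import random

# Sketch: standardize sums of n Uniform(0, 1) variables (a = 1/2, sigma^2 =
# 1/12) and compare the empirical CDF of the result with the normal CDF Phi.
random.seed(7)
n, trials = 100, 20_000
a, sigma = 0.5, math.sqrt(1.0 / 12.0)

zs = [(sum(random.random() for _ in range(n)) - n * a) / (math.sqrt(n) * sigma)
      for _ in range(trials)]

def phi(x):
    # standard normal CDF, via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

for x in (-1.0, 0.0, 1.0, 2.0):
    empirical = sum(z <= x for z in zs) / trials
    print(f"x = {x:+.1f}: empirical CDF {empirical:.4f} vs Phi(x) {phi(x):.4f}")
```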
