Ninth knowledge point: What are entropy and information as defined by Shannon?

This is the last knowledge point on computer theory. We discuss the basic concepts of information theory: entropy and information as defined by Shannon.

Information theory was established by Claude E. Shannon in 1948. It was first used mostly in signal processing, but after decades of development it has been applied to many other disciplines. This article tries to give a simple introduction to two basic concepts: entropy and information. If you are interested, I personally recommend [1] to learn more.

Entropy

Entropy is a measure of the uncertainty of one or more variables.

Suppose we investigate which page people open first when they launch their browser. We sample two groups of testers: four cryptography researchers from the Bristol Cryptogroup and four passengers at the Bristol bus station. Let us make a radical assumption: the four cryptography researchers will all visit http://bristolcrypto.blogspot.co.uk/ first.

Now let us evaluate their answers. Obviously, the cryptographers' answers are quite certain (low uncertainty), whereas the passengers' answers are hard to guess (high uncertainty). In other words, we say the answers of the cryptographer group have low entropy, and the answers of the passenger group have high entropy.

Shannon's famous contribution is the definition of Shannon entropy:

\(H = -\sum_i p_i \log_b p_i\)

where \(p_i\) is the probability of the \(i\)-th answer appearing. In computer science we usually use \(b = 2\) (bits).

If we calculate the two entropies, we get

\(H_{cryptographer} = -\sum_{i=1}^{4} 1 \cdot \log_2 1 = 0\)

\(H_{passenger} = -\sum_{i=1}^{4} \frac{1}{4} \log_2 \frac{1}{4} = 2\)

So the entropy of the passengers' answers is indeed higher than that of the cryptographers!
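
To make the calculation concrete, here is a minimal Python sketch (not part of the original article); it assumes the distributions described above: each cryptographer opens the same page with probability 1, and a passenger picks one of four pages uniformly at random.

```python
# A minimal sketch: Shannon entropy H = -sum_i p_i * log_b(p_i), with b = 2 (bits).
from math import log2

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Each cryptographer opens the same page with probability 1 -> zero uncertainty.
print(shannon_entropy([1.0]))                     # 0.0 bits (may print as -0.0)

# A passenger picks one of four pages uniformly at random -> maximal uncertainty.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```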

Information

Formally, Shannon's definition of information is given in [2]:

Information is a measure of one's freedom of choice when one selects a message.

To explain this, let us make a small modification to the previous example. Suppose we also grab four passengers at Bristol train station, and assume that their answers are likewise random web pages, just like those of the bus station passengers.

The question is: given an answer \(y\), can you tell which group it came from?

If \(y\) is http://bristolcrypto.blogspot.co.uk/, then we immediately know the answer comes from our cryptographer group. But if \(y\) is a random page, we have a hard time deciding. So we can say that the answer http://bristolcrypto.blogspot.co.uk/ contains more information than a random one.

So what does this have to do with entropy?

Extending the definition of entropy, we define the conditional entropy as:

\[ H(Y|X) = \sum_{x \in X} p(x) H(Y|X=x) \]

This formula describes the entropy of \(Y\) under the condition \(X = x\), averaged over \(x\). More precisely, since entropy measures the uncertainty of a variable, the conditional entropy defined above is the remaining uncertainty of \(Y\) when \(X\) is given as a "clue" (condition).

Observation: consider two variables \(X\) and \(Y\). If \(X\) contains only minimal information about \(Y\), then additionally being given the precise value of \(X\) should not help us much in inferring the value of \(Y\); that is, it does not significantly reduce the uncertainty of \(Y\). On the other hand, if \(X\) contains essential information about \(Y\), then the entropy of \(Y\) should be much lower when \(X\) is given. Therefore, conditional entropy can be regarded as a reasonable measure of the information that \(X\) gives about \(Y\)!
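
Here is a minimal sketch of how conditional entropy could be computed from a joint distribution; it assumes the joint probabilities are given as a Python dictionary {(x, y): probability}, and the function name and toy distributions are illustrative, not from the original article.

```python
# A minimal sketch: conditional entropy H(Y|X) = sum_x p(x) * H(Y|X=x),
# for a joint distribution given as {(x, y): probability}.
from collections import defaultdict
from math import log2

def conditional_entropy(joint):
    # marginal distribution p(x)
    p_x = defaultdict(float)
    for (x, _y), p in joint.items():
        p_x[x] += p
    # weighted sum of the entropies of Y given each value x
    h = 0.0
    for x, px in p_x.items():
        cond = [pxy / px for (x2, _y), pxy in joint.items() if x2 == x and pxy > 0]
        h += px * -sum(q * log2(q) for q in cond)
    return h

# X fully determines Y, so knowing X leaves no uncertainty: H(Y|X) = 0
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))    # 0.0
# X and Y are independent uniform bits: H(Y|X) = H(Y) = 1 bit
print(conditional_entropy({(0, 0): 0.25, (0, 1): 0.25,
                           (1, 0): 0.25, (1, 1): 0.25}))   # 1.0
```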

Another important metric is mutual information (Mutual Information). It is a measure of the mutual dependence between two variables. One way to define it is as the reduction in entropy:

\(I(X;Y) = H(X)-H(X|Y)=H(Y)-H(Y|X)\)
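
Building on the sketch above (and reusing its conditional_entropy function, which is an assumption of this illustration), mutual information can be evaluated directly from the joint distribution via \(I(X;Y) = H(Y) - H(Y|X)\):

```python
# A minimal sketch: mutual information via I(X;Y) = H(Y) - H(Y|X).
# Assumes conditional_entropy() from the sketch above.
from collections import defaultdict
from math import log2

def mutual_information(joint):
    # marginal distribution p(y) and its entropy H(Y)
    p_y = defaultdict(float)
    for (_x, y), p in joint.items():
        p_y[y] += p
    h_y = -sum(p * log2(p) for p in p_y.values() if p > 0)
    return h_y - conditional_entropy(joint)

# X determines Y completely: I(X;Y) = H(Y) = 1 bit
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))     # 1.0
# X and Y are independent: I(X;Y) = 0 bits
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))    # 0.0
```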

An example in cryptography

Concepts from information theory are widely used in cryptography. A typical example is to treat a cryptosystem as a channel whose input is the plaintext and whose output is the ciphertext. The study of side channels has also benefited from information theory.
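
As a hedged illustration of this channel view (not claimed by the original article), take a one-bit one-time pad with a uniform key chosen independently of the plaintext: the ciphertext is then independent of the plaintext, and the mutual information between them, computed with the sketch above, is zero bits.

```python
# A hedged illustration (assumed setting: one-bit one-time pad, uniform key
# independent of the plaintext; reuses mutual_information() from the sketch above).
# ciphertext = plaintext XOR key, so every (plaintext, ciphertext) pair is
# equally likely and the ciphertext leaks nothing about the plaintext.
from collections import defaultdict

joint_pc = defaultdict(float)
for m in (0, 1):          # plaintext bit, uniform
    for k in (0, 1):      # key bit, uniform, independent of the plaintext
        c = m ^ k         # ciphertext bit
        joint_pc[(m, c)] += 0.25

print(mutual_information(dict(joint_pc)))  # 0.0 bits
```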

[1] Thomas M. Cover and Joy A. Thomas. Elements of Information Theory, 2nd edition. Wiley-Interscience, July 2006.

[2] S. Vajda, Claude E. Shannon, and Warren Weaver. The Mathematical Theory of Communication. The Mathematical Gazette, 34(310):312+, December 1950.

[3] http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
