Part I / Chapter 3 Information Theory

Information theory quantifies how much information a signal contains. The basic idea: learning that an unlikely event has actually occurred conveys more information than learning that a very likely event has occurred.

First, to quantify information, the basic ideas of information theory require the following:

  1. A very likely event carries little information; in the extreme case, an event that is guaranteed to occur should carry no information at all.

  2. A less likely event carries more information.

  3. Independent events should have additive information: for example, learning that two independent coin tosses both came up heads should convey twice as much information as learning that a single toss came up heads (a worked check follows this list).
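As a quick worked check of the additivity requirement (it uses the self-information $I(x) = -\log_2 P(x)$ in bits, which the next section introduces), consider a fair coin:

$I(\text{one head}) = -\log_2 \tfrac{1}{2} = 1 \text{ bit}, \qquad I(\text{two heads}) = -\log_2 \tfrac{1}{4} = 2 \text{ bits} = 2\, I(\text{one head})$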

Second, self-information and Shannon entropy:
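For reference, a sketch of the standard definitions (using the natural logarithm, so information is measured in nats):

$I(x) = -\log P(x)$ (self-information of observing the event $x$)

$H(x) = \mathbb{E}_{x \sim P}[I(x)] = -\mathbb{E}_{x \sim P}[\log P(x)]$ (Shannon entropy of the distribution $P$)

Self-information satisfies all three requirements above, and Shannon entropy is the expected self-information, i.e. the expected amount of information conveyed by an event drawn from $P$.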

Third, KL divergence:

  1. Definition: if we have two separate probability distributions P(x) and Q(x) over the same random variable x, we can measure how different the two distributions are with the KL divergence: $D_{\mathrm{KL}}(P \| Q) = \mathbb{E}_{x \sim P}[\log P(x) - \log Q(x)]$

  2. Purpose: it measures the extra information needed to send a message containing symbols drawn from probability distribution P, when we use a code that was designed to minimize the message length for probability distribution Q.

  3. Cross-entropy: $H(P, Q) = H(P) + D_{\mathrm{KL}}(P \| Q) = -\mathbb{E}_{x \sim P}[\log Q(x)]$
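A minimal Python sketch of these formulas for discrete distributions (the example values of p and q are hypothetical, chosen only for illustration); it checks numerically that $H(P, Q) = H(P) + D_{\mathrm{KL}}(P \| Q)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(P) = -sum_x P(x) log P(x), in nats."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """KL divergence D_KL(P || Q) = sum_x P(x) (log P(x) - log Q(x))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * (np.log(p) - np.log(q)))

def cross_entropy(p, q):
    """Cross-entropy H(P, Q) = -sum_x P(x) log Q(x)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# Two example distributions over the same three outcomes (hypothetical values).
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]

print(kl_divergence(p, q))   # > 0, since P and Q differ
print(kl_divergence(p, p))   # 0, since the distributions are identical
# Check the identity H(P, Q) = H(P) + D_KL(P || Q)
print(np.isclose(cross_entropy(p, q), entropy(p) + kl_divergence(p, q)))  # True
```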

 
