The measure and role of information
- Claude Elwood Shannon: the founder of information theory. BA from the University of Michigan, PhD from MIT. In 1948 he published the landmark paper "A Mathematical Theory of Communication", which laid the foundation of modern information theory.
Unit of information: the bit
- Example: 32 teams compete for the World Cup
If you know nothing about the teams, each team has an equal probability of winning.
To find the winner by repeated dichotomy (binary search), a minimum of 5 yes/no questions is required, since 5 = log2(32).
Equivalently: 5 = -(1/32·log2(1/32) + 1/32·log2(1/32) + ... ), summed over all 32 teams.
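The question count for a binary search over equally likely outcomes is just a base-2 logarithm; a minimal sketch:

```python
import math

# Assumption from the example: 32 equally likely World Cup teams.
n_teams = 32

# Each yes/no question halves the remaining candidates, so the minimum
# number of questions needed to isolate the winner is log2(n_teams).
questions = math.log2(n_teams)
print(questions)  # → 5.0
```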
If some information is revealed, fewer than 5 bits are needed. For example, suppose Germany wins with probability 1/6, Brazil with 1/6, and China with 1/10. Then:
5 > -(1/6·log2(1/6) + 1/6·log2(1/6) + 1/10·log2(1/10) + ... )
- Entropy:
- The amount of information in the answer to "Who will win the World Cup?" is therefore less than 5 bits; its exact value is:
- H = -(p1·log2(p1) + p2·log2(p2) + p3·log2(p3) + ... + p32·log2(p32)), where pi is the probability that team i wins
- H is called entropy, and it is measured in bits
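The entropy formula above can be sketched in Python. The uniform case reproduces the 5-bit answer exactly; the skewed probabilities (Germany 1/6, Brazil 1/6, China 1/10, with the remaining mass split evenly among the other 29 teams) are an assumption extending the example in the text:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform case: 32 equally likely teams gives exactly 5 bits.
print(entropy([1 / 32] * 32))  # → 5.0

# Skewed case (assumed numbers): Germany 1/6, Brazil 1/6, China 1/10,
# remaining probability split evenly among the other 29 teams.
rest = (1 - 1/6 - 1/6 - 1/10) / 29
skewed = [1/6, 1/6, 1/10] + [rest] * 29
print(entropy(skewed) < 5.0)  # → True
```

Any deviation from the uniform distribution lowers the entropy, which is why extra knowledge about the favorites reduces the number of bits needed.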