Information Theory is a risky business because its subject, "information," is extremely enigmatic; its essential nature is hidden in the fog of the fundamental philosophy of life and reality.
While most cryptographic uses of information may enjoy practical clarity, the theoretical vagueness of the concept is always looming in the background.
Information Theory deals with the measurement and efficient movement of useful information.
Information theory was invented by Claude Shannon (1948) through his brilliant insight that information should be measured by its impact.
Shannon observed that information which reduces chaos to order has a greater measure than information which elevates an already orderly situation to a slightly more orderly one.
The amount of information in the message "It will snow tomorrow!" is greater if it refers to a summer day in Miami than if it refers to a winter day in New York.
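A small numeric sketch makes the intuition concrete. In Shannon's framework the information carried by a single message is its surprisal, the negative base-2 logarithm of its probability; the probabilities below are purely illustrative assumptions, not figures from this text.

    import math

    def surprisal_bits(p):
        # Information carried by an event of probability p, in bits.
        return -math.log2(p)

    # Assumed, illustrative probabilities of "It will snow tomorrow!":
    p_summer_miami = 1e-6    # practically unheard of
    p_winter_nyc = 0.3       # a common occurrence

    print(surprisal_bits(p_summer_miami))  # ~19.9 bits: a great deal of information
    print(surprisal_bits(p_winter_nyc))    # ~1.7 bits: far less information

The rarer the event, the more the message telling of it changes what we know, and the larger its measure.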
Shannon contrived a mathematical expression that captures the spread of probability across a range of possibilities, arguing that information is whatever collapses this distribution toward a delta function. He named that expression entropy, and measured the impact of information by how much entropy it removes.
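For a source whose possible messages x occur with probabilities p(x), the expression takes its standard form (stated here for reference, in LaTeX notation):

    H(X) = -\sum_{x} p(x)\,\log_2 p(x)

H is largest when all possibilities are equally likely, and it falls to zero when a single outcome is certain, that is, when the distribution has collapsed to a delta function.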
Entropy measures the confusion, the uncertainty, that keeps our adversary from realizing what we realize, from reading what we read. It measures the erosion of our chosen cipher's intractability.
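The same measure can be read as the adversary's uncertainty about our secret key. A minimal sketch in Python, assuming a key drawn uniformly at random (the 128-bit length and the distributions are illustrative assumptions, not taken from this text):

    import math

    def entropy_bits(probabilities):
        # Shannon entropy of a discrete distribution, in bits.
        return sum(-p * math.log2(p) for p in probabilities if p > 0)

    # A 128-bit key chosen uniformly at random: 2**128 equally likely values,
    # each with probability 2**-128, so the adversary faces 128 bits of entropy.
    print(math.log2(2 ** 128))      # 128.0

    # If the adversary learns the key, the distribution collapses to a delta
    # function over the true key and the entropy drops to zero.
    print(entropy_bits([1.0]))      # 0.0

Every bit of entropy the adversary strips away halves the space of possibilities that still separates them from our plaintext.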