
# Information theory


This article is not about Library and Information Science or Information Technology.

Information theory is a branch of the mathematical theory of probability and mathematical statistics that deals with the concepts of information and information entropy, communication systems, data transmission, rate distortion theory, cryptography, signal-to-noise ratios, data compression, and related topics.

Claude E. Shannon (1916-2001) has been called "the father of information theory" (ISBN 0252725484). His theory "considered the transmission of information as a statistical phenomenon" and gave communications engineers a way to determine the capacity of a communication channel in terms of the common currency of bits. The transmission part of the theory is not "concerned with the content of information or the message itself," while the complementary wing of information theory addresses content through the lossy compression of messages subject to a fidelity criterion. These two wings are tied together by the information transmission theorems, or source-channel separation theorems, which justify the use of bits as the universal currency for information in many contexts.

It is generally accepted that the modern discipline of information theory began with the publication of Claude E. Shannon's article "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. This work drew on earlier publications by Harry Nyquist and Ralph Hartley. In the process of working out a theory of communication that electrical engineers could apply to design better telecommunications systems, Shannon defined a measure of entropy:

$H = - \sum_i p_i \log p_i$

that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits. Shannon's measure of entropy came to be taken as a measure of the information contained in a message, as opposed to the portion of the message that is strictly determined (hence predictable) by its inherent structure, such as the redundancy of natural languages or the statistical properties of a language relating to the frequencies of occurrence of letter or word pairs, triplets, and so on. See Markov chains.
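
The definition translates directly into a few lines of code. The sketch below (in Python; the function name and sample distributions are illustrative, not from the article) computes the entropy of a discrete distribution in bits:

```python
import math

def shannon_entropy(probs, base=2):
    """Entropy H = -sum(p * log p) of a discrete distribution.

    Terms with p == 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin attains the maximum of 1 bit; bias makes it predictable.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```

A fair coin attains the maximum entropy of one bit per toss; any bias makes the outcome partly predictable and lowers the entropy.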

Entropy as defined by Shannon is closely related to entropy as defined by physicists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, and that work was the inspiration for adopting the term entropy in information theory. There are deep relationships between entropy in the thermodynamic and informational senses. For instance, Maxwell's demon needs information to reverse thermodynamic entropy, and acquiring that information exactly balances out the thermodynamic gain the demon would otherwise achieve.

Among other useful measures of information is mutual information, a measure of the statistical dependence between two random variables. Mutual information is defined for two random variables $X$ and $Y$ as

$I(X, Y) = H(X) + H(Y) - H(X, Y)$

where $H(X, Y)$ is the joint entropy, defined as

$H(X, Y) = - \sum_{x, y} p(x, y) \log p(x, y)$

Mutual information is closely related to the log-likelihood ratio test for multinomial distributions and to Pearson's $\chi^2$ test.
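
The identity above can be checked numerically. The following sketch (in Python; the helper names and example tables are illustrative, not from the article) computes mutual information from the three entropies, assuming the joint distribution of $X$ and $Y$ is given as a table of probabilities:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution, skipping zero terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def mutual_information(joint, base=2):
    """I(X, Y) = H(X) + H(Y) - H(X, Y), with the joint distribution
    given as a 2-D table joint[x][y] of probabilities summing to 1."""
    px = [sum(row) for row in joint]          # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]    # marginal distribution of Y
    pxy = [p for row in joint for p in row]   # flattened joint distribution
    return entropy(px, base) + entropy(py, base) - entropy(pxy, base)

# Perfectly correlated variables: I(X, Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Independent variables: I(X, Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

For independent variables $H(X, Y) = H(X) + H(Y)$, so the mutual information vanishes; for perfectly correlated variables it equals the entropy of either variable alone.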

Claude E. Shannon's original paper is available at http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
