Shannon's theorem, rooted in the concept of information entropy, was proved in 1948 by Claude Shannon. It gives the theoretical maximum rate, known as the channel capacity, at which bits can be transmitted over a noisy channel with arbitrarily low error probability. That any such nonzero rate could exist was considered quite surprising at the time, since no scheme was known that could achieve such reliable communication; with this result, information theory as we know it today was born.
The most famous instance is the bandwidth-limited, power-constrained channel in the presence of Gaussian noise, usually expressed as $C = W \log_2(1 + S/N)$, where $C$ is the channel capacity in bits per second, $W$ is the bandwidth in hertz, and $S/N$ is the signal-to-noise ratio.
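As an illustration, here is a minimal Python sketch that evaluates this capacity formula; the bandwidth and signal-to-noise figures below are made-up example values, not numbers from the text.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = W * log2(1 + S/N) in bits per second.

    bandwidth_hz: channel bandwidth W in hertz.
    snr_linear:   signal-to-noise ratio S/N as a linear ratio (not dB).
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical example: a 3 kHz telephone-style channel with an SNR of
# 30 dB, i.e. a linear ratio of 10**(30/10) = 1000.
W = 3000.0                      # bandwidth in Hz
snr_db = 30.0
snr = 10.0 ** (snr_db / 10.0)   # convert dB to a linear ratio

C = shannon_capacity(W, snr)
print(f"Capacity: {C:.0f} bits per second")  # roughly 29,902 bit/s
```

Note that the formula takes the signal-to-noise ratio as a linear power ratio, so a value quoted in decibels must first be converted, as done above.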
Reference
- C. E. Shannon, The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press, 1949 (reprinted 1998).