Channel capacity

Channel capacity, often denoted "C" in communication formulas, is the maximum number of bits of discrete information that a defined segment of a communications medium can carry. A telephone wire, for example, may be considered a channel in this sense. Breaking the frequency bandwidth into smaller sub-segments and using each of them to carry a separate communication reduces the number of bits of information that each segment can carry; the total number of bits the entire wire can carry is not increased by dividing it into smaller sub-segments.

In practice, this sub-segmentation actually reduces the total amount of information the wire can carry, because additional overhead information is required to distinguish the sub-segments from one another.
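This effect can be sketched numerically with the standard Shannon–Hartley capacity formula for a noisy channel. The bandwidth, power, noise, and guard-band figures below are illustrative assumptions, not measurements of a real telephone wire:

```python
import math

def shannon_capacity(bandwidth_hz, signal_power_w, noise_psd_w_per_hz):
    """Shannon-Hartley capacity in bits/s of a bandwidth-limited
    channel with additive white Gaussian noise."""
    noise_power = noise_psd_w_per_hz * bandwidth_hz
    return bandwidth_hz * math.log2(1 + signal_power_w / noise_power)

B = 1e6    # 1 MHz total bandwidth (illustrative value)
P = 1e-3   # 1 mW total signal power (illustrative value)
N0 = 1e-12 # noise power spectral density in W/Hz (illustrative value)

# The whole band used as one channel.
c_full = shannon_capacity(B, P, N0)

# The same band split into 10 sub-channels, each pair separated by a
# 1 kHz guard band -- the "overhead" needed to tell sub-segments apart.
k, guard = 10, 1e3
sub_bw = (B - (k - 1) * guard) / k
c_split = k * shannon_capacity(sub_bw, P / k, N0)

print(f"full band: {c_full:.0f} bits/s, split: {c_split:.0f} bits/s")
```

With the guard bands removed from the usable spectrum, the summed sub-channel capacity comes out strictly below the single-channel capacity, matching the point above: subdivision never gains capacity, and its overhead costs some.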

Information theory, developed by Claude E. Shannon in 1948, provides a mathematical model by which one can compute the maximum amount of information that a channel can carry.
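For a bandwidth-limited channel with additive white Gaussian noise, Shannon's model yields the Shannon–Hartley theorem, stated here as a standard result rather than derived in this article:

\[
C = B \log_2\!\left(1 + \frac{S}{N}\right)
\]

where \(C\) is the capacity in bits per second, \(B\) the bandwidth in hertz, \(S\) the average signal power, and \(N\) the average noise power.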

Two additional considerations apply. First, the channel need not be a wire: it can be a light beam, radio waves, a specified bandwidth, a book, or even elementary particles from whose state information may be gleaned.

Second, an analog signal, such as the human voice, can be measured in terms of the elements of difference that can be detected; these can be counted and treated as "bits" of information.
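Counting detectable differences this way is the idea behind Hartley's law: sampling a band-limited analog signal at the Nyquist rate and distinguishing a fixed number of levels per sample gives a bit rate. The voice bandwidth and level count below are illustrative assumptions, not a telephony standard:

```python
import math

def analog_bit_rate(bandwidth_hz, distinguishable_levels):
    """Hartley's law: bit rate of a noiseless channel sampled at the
    Nyquist rate (2 * bandwidth samples/s), where each sample can be
    resolved into a fixed number of distinguishable levels."""
    bits_per_sample = math.log2(distinguishable_levels)
    return 2 * bandwidth_hz * bits_per_sample

# Roughly voice-band speech (~3.1 kHz) with 256 detectable levels
# per sample: 2 * 3100 * log2(256) = 49600 bits/s.
print(f"{analog_bit_rate(3100, 256):.0f} bits/s")  # -> 49600 bits/s
```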

See also: information theory, redundancy, Shannon capacity



All Wikipedia text is available under the terms of the GNU Free Documentation License

 