Entropy encoding

An entropy encoding is a coding scheme that assigns codes to symbols so as to match code lengths with the probabilities of the symbols. Typically, entropy encoders compress data by replacing fixed-length codes with codes whose length is proportional to the negative logarithm of the symbol's probability; therefore, the most common symbols use the shortest codes.
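As a minimal illustrative sketch (the symbol stream, probabilities, and codewords below are hypothetical, not any particular codec), the following Python snippet compares a fixed-length code with a prefix code whose lengths are matched to symbol frequencies:

```python
# Hypothetical 16-symbol stream where "a" is most frequent
# (p = 0.5, 0.25, 0.125, 0.125 for a, b, c, d).
stream = "a" * 8 + "b" * 4 + "c" * 2 + "d" * 2

fixed = {"a": "00", "b": "01", "c": "10", "d": "11"}     # 2 bits per symbol
matched = {"a": "0", "b": "10", "c": "110", "d": "111"}  # variable-length prefix code

print(sum(len(fixed[s]) for s in stream))    # 32 bits
print(sum(len(matched[s]) for s in stream))  # 28 bits: the common symbols
                                             # received the short codes
```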

According to Shannon's source coding theorem, the optimal code length for a symbol is $-\log_b P$, where $b$ is the number of symbols in the output alphabet and $P$ is the probability of the input symbol.
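A short Python sketch of this formula, using an illustrative probability distribution and $b = 2$ (binary output), so lengths are in bits:

```python
import math

# Illustrative probabilities (assumed for this example).
probabilities = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

for symbol, p in probabilities.items():
    print(symbol, -math.log2(p))  # optimal lengths: 1.0, 2.0, 3.0, 3.0 bits

# Averaging the optimal lengths, weighted by probability, gives the
# entropy of the source: the lower bound on bits per symbol.
entropy = sum(-p * math.log2(p) for p in probabilities.values())
print(entropy)  # 1.75
```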

Two of the most common entropy encoding techniques are Huffman coding and arithmetic coding. If the approximate entropy characteristics of a data stream are known in advance (especially for signal compression), a simpler static code such as unary coding, Elias gamma coding, Fibonacci coding, Golomb coding, or Rice coding may be useful.
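As a sketch of Huffman coding, one of the techniques named above, the following Python snippet computes optimal prefix-code lengths by repeatedly merging the two least probable subtrees. The probabilities are the same illustrative ones as before; the function tracks only code lengths, not the codewords themselves:

```python
import heapq
from itertools import count

def huffman_code_lengths(probs):
    """Huffman's algorithm: return a code length for each symbol.

    A minimal sketch. Each merge pushes every symbol in the merged
    subtrees one level deeper in the code tree, i.e. one bit longer.
    """
    tie = count()  # tiebreaker so the heap never compares symbol lists
    heap = [(p, next(tie), [s]) for s, p in probs.items()]
    heapq.heapify(heap)
    lengths = dict.fromkeys(probs, 0)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:             # each goes one bit deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), syms1 + syms2))
    return lengths

print(huffman_code_lengths({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': 1, 'b': 2, 'c': 3, 'd': 3} -- matching -log2(P) exactly here,
# since these probabilities are negative powers of two
```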

An earlier version of the above article was posted on PlanetMath (http://planetmath.org/encyclopedia/EntropyEncoding). This article is open content.

See also: entropy, universal code


