Tokenize

Tokenizing is the operation of replacing one set of symbols with another, typically so that the resulting set of symbols is smaller than the original.

The term is most commonly used in computing, where programming-language source code, a set of symbols in an English-like format, is converted into another, much smaller format. Most BASIC interpreters used this to save room: a command such as PRINT would be replaced by a single number, which takes up much less space in memory. In fact, most lossless compression systems use a form of tokenizing, although it is typically not referred to as such.
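The keyword replacement described above can be sketched as follows. This is a minimal illustration in the spirit of classic BASIC interpreters, not the scheme of any real one; the keyword table and token byte values are invented for the example.

```python
# Illustrative keyword tokenizing, loosely modeled on classic BASIC
# interpreters. The token byte values below are hypothetical, not taken
# from any actual interpreter's token table.

TOKENS = {"PRINT": 0x99, "GOTO": 0x89, "IF": 0x8B, "THEN": 0xA7}
DETOKENS = {v: k for k, v in TOKENS.items()}

def tokenize(line: str) -> bytes:
    """Replace each known keyword with its one-byte token."""
    out = bytearray()
    for word in line.split(" "):
        if word.upper() in TOKENS:
            out.append(TOKENS[word.upper()])   # keyword -> single byte
        else:
            out.extend(word.encode("ascii"))   # other text stored as-is
        out.append(ord(" "))
    return bytes(out[:-1])  # drop the trailing space

def detokenize(data: bytes) -> str:
    """Expand token bytes back into their keywords for listing."""
    return "".join(DETOKENS.get(b, chr(b)) for b in data)

line = 'PRINT "HELLO"'
stored = tokenize(line)
print(len(line), len(stored))        # the tokenized form is shorter
print(detokenize(stored) == line)    # and the listing is recoverable
```

Because each keyword collapses to one byte, program lines shrink in memory, and the interpreter can list the program back by expanding the tokens, which is exactly why the savings are lossless.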



All Wikipedia text is available under the terms of the GNU Free Documentation License
