
Tokenize

Tokenizing is the operation of replacing one set of symbols with another, typically to make the resulting set of symbols smaller.

The term is most commonly used in computing, where the source code of a program, written in an English-like format, is converted into another representation that is much more compact. Most BASIC interpreters used this technique to save memory: a keyword such as PRINT would be replaced by a single number, which takes far less room than the spelled-out word. In fact, most lossless compression systems use a form of tokenizing, although it is typically not referred to as such.
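As a rough sketch of the keyword replacement just described, the Python example below maps a few BASIC keywords to single-byte tokens and stores everything else as plain ASCII. The token values and the tokenize_line helper are illustrative assumptions, not taken from any real interpreter's token table.

```python
# Minimal sketch of keyword tokenizing in the style of classic BASIC
# interpreters. The token values below are made up for illustration.

KEYWORD_TOKENS = {
    "PRINT": 0x91,
    "GOTO": 0x89,
    "IF": 0x8B,
    "THEN": 0xA7,
}

def tokenize_line(line: str) -> bytes:
    """Replace recognized keywords with single-byte tokens; keep other text as ASCII."""
    out = bytearray()
    for word in line.split(" "):
        upper = word.upper()
        if upper in KEYWORD_TOKENS:
            out.append(KEYWORD_TOKENS[upper])   # keyword becomes one byte
        else:
            out.extend(word.encode("ascii"))    # everything else stored as-is
        out.append(ord(" "))
    return bytes(out[:-1])  # drop the trailing space

source = 'PRINT "HELLO"'
tokenized = tokenize_line(source)
print(len(source), "bytes of source ->", len(tokenized), "bytes tokenized")
```

Running this on the one-line program PRINT "HELLO" shrinks 13 bytes of source text to 9 bytes, since the five-character keyword collapses to a single token byte.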




 