Tokenize

Tokenizing is the operation of replacing one set of symbols with another, typically so that the resulting representation is smaller.

The term is most commonly used in computing, where a programming language's source code, a set of symbols in an English-like format, is converted into another, much more compact format. Most BASIC interpreters used this technique to save memory: a keyword such as PRINT would be replaced by a single byte, which takes far less room than the spelled-out word. In fact, most lossless compression systems use a form of tokenizing, although it is typically not referred to as such.
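Below is a minimal sketch of how such a keyword tokenizer might work, written in Python for clarity. The keyword table and its byte values are invented for illustration and are not taken from any particular BASIC dialect, though real interpreters (Commodore BASIC, for example) similarly used token bytes with the high bit set so that tokens could not be confused with plain ASCII program text.

```python
# Illustrative keyword table: each BASIC keyword maps to a one-byte
# token. These byte values are made up for this sketch; real dialects
# each had their own table.
KEYWORDS = {
    "PRINT": 0x80,
    "GOTO":  0x81,
    "IF":    0x82,
    "THEN":  0x83,
}
# Reverse table, used to expand tokens back into keywords.
TOKENS = {v: k for k, v in KEYWORDS.items()}

def tokenize(line: str) -> bytes:
    """Replace each recognized keyword in a source line with its
    one-byte token; all other characters are stored verbatim."""
    out = bytearray()
    i = 0
    while i < len(line):
        for word, token in KEYWORDS.items():
            if line.startswith(word, i):
                out.append(token)
                i += len(word)
                break
        else:
            out.append(ord(line[i]))
            i += 1
    return bytes(out)

def detokenize(data: bytes) -> str:
    """Expand tokens back to keywords, e.g. when LISTing a program."""
    return "".join(TOKENS.get(b, chr(b)) for b in data)

line = 'IF X THEN PRINT "HELLO"'
stored = tokenize(line)
print(len(line), "->", len(stored), "bytes")  # 23 -> 15 bytes
assert detokenize(stored) == line
```

A real tokenizer would also skip over quoted string literals so that keywords inside them are left untouched; that detail is omitted here to keep the sketch short.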


