Computer word

A computer word is the "natural" unit of data a particular computer handles — the amount of memory it reads and writes in a single operation. For instance, many early computers used a 36-bit word, meaning the machine read and wrote 36 bits at a time. This size reflected the then-common need to store decimal numbers efficiently: digits were often stored as 6-bit binary-coded decimal values, so a 36-bit machine could handle six such digits at once, while lower-cost machines typically used 12-, 18-, or 24-bit words instead.
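The idea of fitting several 6-bit digits into one 36-bit word can be sketched in a few lines. This is an illustration only, not any particular machine's storage format; the function names `pack_36bit` and `unpack_36bit` are hypothetical.

```python
def pack_36bit(codes):
    """Pack six 6-bit values (0-63) into a single 36-bit integer,
    most significant digit first (illustrative layout only)."""
    assert len(codes) == 6, "a 36-bit word holds six 6-bit digits"
    word = 0
    for code in codes:
        assert 0 <= code < 64, "each value must fit in 6 bits"
        word = (word << 6) | code   # shift previous digits left, append the new one
    return word

def unpack_36bit(word):
    """Recover the six 6-bit values from a 36-bit word."""
    return [(word >> shift) & 0x3F for shift in range(30, -1, -6)]
```

Round-tripping a list of digit codes through `pack_36bit` and `unpack_36bit` returns the original list, and the packed value always fits in 36 bits.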

Today the 6-bit digit has largely disappeared, and the basic building block of computer words is 8 bits, or one byte. The change came as computers were increasingly used for text processing, which required 7 or 8 bits to store an ASCII character. The first machine to widely introduce words built from 8-bit multiples was the IBM System/360 in the 1960s, and the convention quickly took over the entire market.

Today the term "word" is used less often on its own; instead the width is simply stated in bits. For instance, most common CPUs today use a 32-bit word, but we refer to them as "32-bit processors".
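One common way to inspect the native word size of the machine a program runs on is to check the size of a native pointer. A minimal sketch, using Python's standard `struct` module (the `"P"` format character denotes a native pointer):

```python
import struct

# Size in bytes of a native pointer: commonly 4 on 32-bit systems
# and 8 on 64-bit systems.
pointer_bytes = struct.calcsize("P")
print(f"native pointer size: {pointer_bytes * 8} bits")
```

Pointer size is only a proxy for word size, but on mainstream hardware the two usually coincide.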



All Wikipedia text is available under the terms of the GNU Free Documentation License

 