Computer word

A computer word is the "natural" unit of data that a particular computer reads and writes in a single operation. For instance, many early computers used 36-bit words, that is, the computer would read and write 36 bits at a time. This size reflected the then-common need to store decimal numbers efficiently: digits and characters were commonly encoded in 6-bit binary-coded decimal (BCD) codes, so a 36-bit machine could handle six of these digits at a time. Lower-cost machines typically used 12-, 18- or 24-bit words instead.
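As a minimal sketch of this arithmetic in C: six 6-bit codes fit exactly in one 36-bit word. Modern C has no 36-bit integer type, so the sketch below holds the word in the low bits of a 64-bit integer; the digit values are hypothetical placeholders.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Six 6-bit codes, each in the range 0-63; values are placeholders. */
    uint8_t digits[6] = {1, 2, 3, 4, 5, 6};
    uint64_t word = 0;

    /* Shift in 6 bits at a time: 6 digits x 6 bits = 36 bits total. */
    for (int i = 0; i < 6; i++)
        word = (word << 6) | (digits[i] & 0x3F);

    /* Prints the 36-bit word as 9 hexadecimal digits. */
    printf("packed word: 0x%09llx\n", (unsigned long long)word);
    return 0;
}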

Today the 6-bit digit has largely disappeared, and the basic unit for computer words is the 8-bit byte. This change occurred as computers came to be used more often for text processing, which required 7 or 8 bits to store an ASCII character (the 128-character ASCII set needs 7 bits, since 2^7 = 128). The first machine to popularize word sizes that are multiples of 8 bits was the IBM System/360 in the 1960s, and this convention quickly took over the entire market.

Today the term "word" is rarely used on its own; instead, we simply refer to the number of bits. For instance, most common CPUs today use a 32-bit word, but we refer to them as "32-bit processors".
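As a minimal sketch of how these sizes show up in practice, the following C program (assuming a hosted C99 compiler) prints the byte, int, and pointer widths of whatever machine it is compiled on; on a typical 32-bit processor, both int and pointers are 32 bits wide.

#include <stdio.h>
#include <limits.h>

int main(void) {
    /* CHAR_BIT is the number of bits in a byte, 8 on virtually all
       modern machines. */
    printf("bits in a byte:    %d\n", CHAR_BIT);
    printf("bits in an int:    %zu\n", sizeof(int) * CHAR_BIT);
    printf("bits in a pointer: %zu\n", sizeof(void *) * CHAR_BIT);
    return 0;
}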