Computer word


A computer word is the "natural" unit of data that a particular computer handles at one time, that is, the amount of memory it reads and writes in a single operation. For instance, many early computers used 36-bit words: the computer would read and write 36 bits at a time. This size was chosen because it accommodated floating-point numbers of 7-digit precision while also being an exact multiple of the 6-bit character codes that were common at the time.
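A rough back-of-the-envelope calculation illustrates the arithmetic behind that choice (the breakdown below is illustrative, not a description of any specific historical format): representing 7 decimal digits requires a significand of at least

\lceil 7 \log_2 10 \rceil = \lceil 23.25 \rceil = 24 \text{ bits},

which fits in a 36-bit word with room left over for a sign and an exponent, while six 6-bit character codes fill a 36-bit word exactly (6 \times 6 = 36).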

A 36-bit machine could handle 6 such characters at a time, which led early versions of Fortran to limit identifiers to 6 characters (6×6 = 36 bits). Lower-cost minicomputers typically used 12-bit words, the CDC 6600 supercomputers used 60-bit words, and the Soviet BESM-6 mainframes used 48-bit words.
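The following C sketch shows how six 6-bit character codes can be packed into one 36-bit word held in a 64-bit integer; the specific codes and the most-significant-first packing order are hypothetical choices for illustration, not any particular historical character set.

#include <stdint.h>
#include <stdio.h>

/* Pack six 6-bit character codes into one 36-bit word held in a
   64-bit integer. The packing order (first code in the highest bits)
   is an illustrative choice, not a specific historical format. */
static uint64_t pack_word(const uint8_t codes[6])
{
    uint64_t word = 0;
    for (int i = 0; i < 6; i++)
        word = (word << 6) | (codes[i] & 0x3F);   /* keep only 6 bits per code */
    return word;                                  /* at most 36 bits are used */
}

int main(void)
{
    /* Hypothetical 6-bit codes for a 6-character identifier */
    const uint8_t codes[6] = {1, 2, 3, 4, 5, 6};
    printf("36-bit word: %09llx\n", (unsigned long long)pack_word(codes));
    return 0;
}

On an actual word-addressed 36-bit machine this packing was done by the hardware and compiler rather than by explicit code like this; the example only shows how the bit budget works out.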

Today the 6-bit character has largely disappeared, and the basic building block for computer words is 8 bits, or one byte. This change occurred when computers came to be used more widely for text processing, which required 7 or 8 bits to store an ASCII character. The first machine to popularize word sizes that are multiples of 8 bits was the IBM System/360, introduced in 1964, and this convention quickly took over the market.

Today the term "word" is rarely used; instead, the size is simply stated as a number of bits. For instance, most common CPUs today use a 32-bit word, but they are referred to as "32-bit processors".
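As a minimal sketch of how this plays out in practice (the exact figures depend on the compiler and platform), the following C program prints the bit widths of a few basic types and of a pointer on whatever machine compiles it; on a typical 32-bit processor the int and pointer are both 32 bits wide.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Bit widths of common types on the host; which of these
       corresponds to the hardware "word" varies by platform and ABI. */
    printf("char : %zu bits\n", sizeof(char)  * (size_t)CHAR_BIT);
    printf("int  : %zu bits\n", sizeof(int)   * (size_t)CHAR_BIT);
    printf("long : %zu bits\n", sizeof(long)  * (size_t)CHAR_BIT);
    printf("void*: %zu bits\n", sizeof(void*) * (size_t)CHAR_BIT);
    return 0;
}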