Sign-value notation
In Computers
Sign-value notation in computers is the use of the high-order bit (the left end) of a binary word to represent the numeric sign: 0 for + and 1 for -, followed by a binary number that is either an absolute magnitude or the 2's complement of an absolute magnitude. For example, 10011 means minus 3 and 01001 means plus 9 in sign-magnitude form, while 10111 means minus 9 in 2's complement.
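A minimal sketch in Python (not part of the original article) contrasting the two interpretations of a 5-bit word; the function names are illustrative only:

    # Decode a 5-bit word under a sign-magnitude interpretation and
    # under a two's-complement interpretation.

    def sign_magnitude(bits: str) -> int:
        # Leading bit is the sign; the remaining bits are the magnitude.
        sign = -1 if bits[0] == '1' else 1
        return sign * int(bits[1:], 2)

    def twos_complement(bits: str) -> int:
        # The whole word is read as a two's-complement number.
        value = int(bits, 2)
        if bits[0] == '1':              # leading 1 means negative
            value -= 1 << len(bits)     # subtract 2**width
        return value

    print(sign_magnitude('10011'))   # -3  (sign 1, magnitude 0011)
    print(sign_magnitude('01001'))   #  9  (sign 0, magnitude 1001)
    print(twos_complement('10111'))  # -9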
In Ancient Number Systems
Sign-value notation represents numbers by a series of signs that added together equal the number represented. In Roman numerals, for example, X means ten and L means fifty, so LLXXX means one hundred and thirty (50 + 50 + 10 + 10 + 10); a small sketch at the end of this section evaluates such numerals additively. There is no need for a zero in sign-value notation.

Sign-value notation was the prehistoric way of writing numbers and only gradually evolved into place-value notation. If prehistoric people wanted to write "two sheep" in clay, they would inscribe in the clay a picture of two sheep. This became impractical when they wanted to write "twenty sheep". In Mesopotamia they used small clay tokens and strung them like beads on a string: there was a token for one sheep, a token for ten sheep, and so on. To ensure that nobody could alter the number and type of tokens, they invented a clay envelope shaped like a hollow ball into which the tokens on a string were placed. If anybody contested the number, they could break open the clay envelope and do a recount. To avoid damaging the evidence, they wrote archaic number signs on the outside of the envelope, each sign similar in shape to the token it represented. Since there was seldom any need to break open the envelope, the signs on the outside became the first written notation for numbers.
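As a brief illustration (not from the original text), the additive principle can be expressed directly in Python; the symbol table and function name below are assumptions for the sketch, and the later subtractive Roman rule (IV = 4) is deliberately omitted because pure sign-value notation has no such rule:

    # Evaluate a purely additive sign-value numeral by summing the value
    # of every sign, with no subtractive rule.
    SIGN_VALUES = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

    def sign_value(numeral: str) -> int:
        return sum(SIGN_VALUES[sign] for sign in numeral)

    print(sign_value('LLXXX'))  # 130 = 50 + 50 + 10 + 10 + 10
    print(sign_value('X'))      # 10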