From Encyclopedia of Mathematics

Conventional signs used to denote numbers (cf. Number). The earliest and most primitive method is the verbal notation of numbers, which in isolated cases persisted for a fairly long time. (For example, some mathematicians in the Middle and Far East systematically wrote numbers out in words up to the 10th century and even later.) With the development of the social and economic life of the people there arose a need for a notation more convenient than verbal writing, and for worked-out principles of writing numbers — systems for the representation of numbers (cf. Numbers, representations of).

The oldest ciphers known to us are those of Babylonia and Egypt. The Babylonian ciphers (2000 B.C. to the beginning of our era) are cuneiform symbols for the numbers $1$, $10$ and $100$ (or only for $1$ and $10$); all other natural numbers were written by means of combinations of them. In the Egyptian hieroglyphic numeration (which dates from about 3000–2500 B.C.) there were individual signs to denote the powers of ten (up to $10^7$).

Number systems of the type of the Egyptian hieroglyphics were used in Phoenicia, Syria and Greek Attica. The origins of the Attic number system date to the 6th century B.C.; it remained in use in Attica up to the 1st century A.D., although in the other Greek states it had long since been displaced by the more convenient alphabetic Ionic number system, in which the units, tens and hundreds are denoted by letters of the Greek alphabet and all remaining numbers up to 999 by combinations of them (the first notations of numbers in this system go back to the 5th century B.C.). Alphabetic notation of numbers was also used by other peoples, for example in Arabia, Syria, Palestine, Georgia, and Armenia. The old Russian number system (which arose around the 10th century and lasted until the 16th century) was also alphabetic (see Slavic numerals). The longest-lived of the ancient cipher systems turned out to be the Roman numeration, which originated around 500 B.C. with the Etruscans; it is still used occasionally today (see Roman numerals).

The prototypes of the modern ciphers (including 0) appeared in India, probably not later than the 5th century A.D. The convenience of writing numbers with these ciphers in the decimal positional system led to their spread from India to other countries. The Indian ciphers were taken over by the Arabs and reached Europe in the 10th–13th centuries (hence their other name, "Arabic" ciphers, is preserved to this day); they obtained universal acceptance there by the second half of the 15th century. The form of the Indian ciphers underwent a number of important changes over time; their early history is not well known.
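The convenience of the decimal positional system mentioned above can be illustrated with a short sketch (Python, purely illustrative; the function names are ours): it contrasts the positional expansion of a number — each of ten digit symbols weighted by a power of ten according to its place — with the additive Roman notation, in which fixed symbols for fixed values are strung together.

```python
# Positional (Indo-Arabic) notation: a digit's value depends on its place,
# so ten symbols suffice for every natural number.
def decimal_expansion(n):
    """Return n as (digit, power-of-ten) pairs, most significant first."""
    digits = [int(d) for d in str(n)]
    k = len(digits)
    return [(d, 10 ** (k - 1 - i)) for i, d in enumerate(digits)]

# Additive Roman notation: symbols denote fixed values and are combined
# mostly by addition, with the subtractive pairs CM, CD, XC, XL, IX, IV.
def to_roman(n):
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in values:
        while n >= value:          # append the largest symbol that fits
            out.append(symbol)
            n -= value
    return "".join(out)

print(decimal_expansion(1994))  # [(1, 1000), (9, 100), (9, 10), (4, 1)]
print(to_roman(1994))           # MCMXCIV
```

In the positional system the length of a numeral grows only logarithmically with the number, and the same ten ciphers serve for all magnitudes; the Roman system needs ever new symbols for larger powers.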

For references, see Numbers, representations of.


For more details, a description of the various ciphers used, e.g. in the hieroglyphic systems, and a discussion of the origin of the zero symbol — about which much is still uncertain — see also [a1], especially pp. 11 ff, 64 ff, 234 ff.

The word cipher is also used to denote a cryptographic system and the key to such a system, cf. Cryptology.


[a1] C.B. Boyer, "A history of mathematics", Wiley (1968)
This article was adapted from an original article by V.I. Bityutskov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.