From Encyclopedia of Mathematics

A binary unit of information, numerically equal to the amount of information obtained in a trial with two mutually exclusive, equally probable outcomes:

$$H = \log_2 2 = 1 \ \text{bit},$$

the logarithm being taken to base 2. The bit is the most frequently used unit, but other units of information are also employed: the "Hartley" and the "nit", whose definitions involve decimal and natural logarithms, respectively (1 Hartley $= \log_2 10 \approx 3.32$ bits; 1 nit $= \log_2 e \approx 1.44$ bits).
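As a small numerical illustration (not part of the original article), the defining formula and the unit conversions can be checked directly; the function and constant names below are ad hoc:

```python
import math

def information_bits(n_outcomes: int) -> float:
    """Information (in bits) gained from one trial with n equally
    probable, mutually exclusive outcomes: log2(n)."""
    return math.log2(n_outcomes)

# Two equally probable alternatives yield exactly one bit.
assert information_bits(2) == 1.0

# Standard conversion factors between units of information:
# 1 Hartley = log2(10) bits (decimal logarithm),
# 1 nit     = log2(e)  bits (natural logarithm).
BITS_PER_HARTLEY = math.log2(10)      # ~3.3219 bits
BITS_PER_NIT = math.log2(math.e)      # ~1.4427 bits
```

For example, a trial with eight equally probable outcomes carries `information_bits(8) == 3.0` bits, consistent with three independent binary choices.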


The definition given above comes from information theory. In computer science, the term "bit" usually refers to the representation of "0" or "1" by a suitable physical device that can be in exactly one of two alternative states, or even to that device itself. The units "nit" and "Hartley" are virtually unknown in the West.

How to Cite This Entry:
Bit. Encyclopedia of Mathematics. URL:
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article