Bit
A binary unit of information, numerically equal to the amount of information obtained during a trial with two mutually exclusive, equally probable alternatives ($p_1 = p_2 = 1/2$):
$$ I(p_1, p_2) = -p_1 \log_2 p_1 - p_2 \log_2 p_2 = \log_2 2 = 1 \textrm{ bit}, $$
the logarithms being taken to the base 2. The bit is the most frequently used unit of information, but other units are also employed: the "Hartley" and the "nit", whose definitions involve decimal and natural logarithms, respectively.
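The formula above can be checked numerically. The sketch below (a hypothetical helper, not part of the source) computes the information of a two-outcome trial in a chosen logarithm base, so the same call yields bits, Hartleys, or nits:

```python
import math

def information(p1, p2, base=2):
    # Shannon information of a trial with two mutually exclusive
    # alternatives of probabilities p1 and p2 (hypothetical helper).
    return -p1 * math.log(p1, base) - p2 * math.log(p2, base)

# Equally probable alternatives (p1 = p2 = 1/2):
print(information(0.5, 0.5))               # 1.0 bit (base 2)
print(information(0.5, 0.5, base=10))      # ~0.301 Hartley (base 10)
print(information(0.5, 0.5, base=math.e))  # ~0.693 nit (base e)
```

Changing the base only rescales the unit: one bit equals $\log_{10} 2 \approx 0.301$ Hartley and $\ln 2 \approx 0.693$ nit.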
Comments
The definition as given comes from information theory. In computer science the term "bit" usually refers to the representation of "0" or "1" by a suitable physical device that can be in exactly one of two alternative states, or even to that device itself. Nits and Hartleys are unknown in the West.
Bit. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Bit&oldid=11378