
Bit



A binary unit of information, numerically equal to the amount of information obtained during a trial with two mutually exclusive, equally probable alternatives ($ p _ {1} = p _ {2} = 1/2 $):

$$ I (p _ {1} , p _ {2} ) = \ - p _ {1} \mathop{\rm log} _ {2} p _ {1} - p _ {2} \mathop{\rm log} _ {2} p _ {2} = \ \mathop{\rm log} _ {2} 2 = 1 \ \textrm{ bit } , $$

the logarithms being taken to base 2. The bit is the most frequently used unit, but other units of information are also employed: the "hartley" and the "nit", whose definitions use decimal and natural logarithms respectively.
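As an illustration, here is a minimal sketch in Python (not part of the original article) that evaluates the formula above and converts between the three units; the function name and constants are chosen for this example only:

import math

def information_bits(p1: float, p2: float) -> float:
    """Information of a trial with two mutually exclusive
    alternatives of probabilities p1 and p2, measured in bits:
    I = -p1*log2(p1) - p2*log2(p2)."""
    return -p1 * math.log2(p1) - p2 * math.log2(p2)

# Two equally probable alternatives yield exactly log2(2) = 1 bit.
print(information_bits(0.5, 0.5))  # 1.0

# Conversion factors follow from the change of logarithm base:
# 1 hartley (decimal log) = log2(10) bits, 1 nit (natural log) = log2(e) bits.
BITS_PER_HARTLEY = math.log2(10)       # approximately 3.3219
BITS_PER_NIT = math.log2(math.e)       # approximately 1.4427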

Comments

The definition as given comes from information theory. In computer science the term "bit" usually refers to the representation of "0" or "1" by a suitable physical device that can be in exactly one of two alternative states, or even to that device itself. Nits and hartleys are unknown in the West.

How to Cite This Entry:
Bit. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Bit&oldid=46076
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.