# Uncertainty principle, mathematical

The mathematical uncertainty principle is the following meta-theorem: it is not possible for a non-trivial function and its Fourier transform to be simultaneously sharply localized (concentrated).

Depending on how the term "concentration" is defined, one obtains various concrete manifestations of this principle. One of them (the Heisenberg uncertainty inequality below), correctly interpreted, is in fact the celebrated Heisenberg uncertainty principle of quantum mechanics in disguise ([a13]).

A comprehensive discussion of various (mathematical) uncertainty principles can be found in [a10].

## Heisenberg uncertainty inequality.

Defining concentration in terms of standard deviation leads to the Heisenberg uncertainty inequality. If $f \in L ^ { 2 } ( \mathbf{R} )$ and $a \in \bf R$, the quantity $\int | x - a | ^ { 2 } | f ( x ) | ^ { 2 } d x$ is a measure of the concentration of $f$ around $a$: roughly speaking, the more concentrated $f$ is around $a$, the smaller this quantity will be. If one normalizes $f$ so that $\| f \| _ { 2 } = 1$, then by the Plancherel theorem $\| \widehat { f } \| _ { 2 } = 1$. Here, $\hat { f }$ is the Fourier transform of $f$, defined by

\begin{equation*} \widehat { f } ( y ) = \int _ { - \infty } ^ { \infty } f ( x ) e ^ { - 2 \pi i x y } d x, \end{equation*}

the convergence of the integral being interpreted suitably. Then, for $a , b \in \bf R$ one has the Heisenberg inequality

\begin{equation*} \left( \int _ { - \infty } ^ { \infty } ( x - a ) ^ { 2 } | f ( x ) | ^ { 2 } d x \right) \left( \int _ { - \infty } ^ { \infty } ( y - b ) ^ { 2 } | \hat { f } ( y ) | ^ { 2 } d y \right) \geq \frac { 1 } { 16 \pi ^ { 2 } }. \end{equation*}

Thus, the above inequality says that if $f$ is concentrated around $a \in \bf R$, then, no matter which $b \in \mathbf{R}$ is chosen, $\hat { f }$ cannot be concentrated around $b$. Equality is attained in the above if and only if $f$ is, modulo translation and multiplication by a phase factor, a Gaussian function (i.e. of the form $K e ^ { - c x ^ { 2 } }$ with $c > 0$).
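The equality case can be checked numerically. The following sketch (an illustration, not part of the original article) discretizes the $L^2$-normalized Gaussian $f(x) = 2^{1/4} e^{-\pi x^2}$, which with the $e^{-2\pi i x y}$ convention above is its own Fourier transform, computes the transform by direct quadrature of the defining integral, and verifies that the product of the two concentration integrals attains the bound $1/(16\pi^2)$:

```python
import numpy as np

# Numerical check of the Heisenberg inequality at its equality case.
# Assumption: the L^2-normalized Gaussian f(x) = 2^{1/4} e^{-pi x^2} and the
# e^{-2 pi i x y} Fourier convention used above, under which f is self-dual.
N = 1601
x = np.linspace(-6.0, 6.0, N)
dx = x[1] - x[0]
f = 2 ** 0.25 * np.exp(-np.pi * x ** 2)

# ||f||_2^2 should be 1 (Riemann sum; f decays fast, so this is accurate).
norm_sq = np.sum(np.abs(f) ** 2) * dx

# Fourier transform by direct quadrature of the defining integral.
y = x.copy()
kernel = np.exp(-2j * np.pi * np.outer(y, x))  # kernel[j, k] = e^{-2 pi i y_j x_k}
fhat = kernel @ f * dx

# The two concentration integrals, centered at a = b = 0.
var_f = np.sum(x ** 2 * np.abs(f) ** 2) * dx
var_fhat = np.sum(y ** 2 * np.abs(fhat) ** 2) * dx

product = var_f * var_fhat
bound = 1.0 / (16.0 * np.pi ** 2)
print(norm_sq, product, bound)
```

For this Gaussian the computed product agrees with $1/(16\pi^2) \approx 0.00633$ to high accuracy; any other normalized $f$ would give a strictly larger product.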

## Benedicks' theorem.

Concentration can also be measured in terms of the "size" of the set on which $f$ is supported (cf. also Support of a function). If one takes "size" to mean Lebesgue measure, then M. Benedicks ([a4], [a1]) has proved the following result: If $f \in L ^ { 2 } ( \mathbf{R} )$ is a non-zero function, then it is impossible for both $A = \{ x : f ( x ) \neq 0 \}$ and $B = \left\{ y : \widehat { f } ( y ) \neq 0 \right\}$ to have finite Lebesgue measure. (This is a significant generalization of the fact, well known to communication engineers, that a function cannot be both time limited and band limited.) For various other uncertainty principles of this kind, see [a11].
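The classical time-limited/band-limited special case is easy to see concretely. In the sketch below (an illustration under the conventions above, not from the original article), $f$ is the indicator of $[-1/2, 1/2]$; its Fourier transform is $\operatorname{sinc}(y) = \sin(\pi y)/(\pi y)$, which vanishes only at the nonzero integers, so the set $B$ has infinite Lebesgue measure, as Benedicks' theorem demands:

```python
import numpy as np

# Benedicks' theorem, classical special case: the indicator of [-1/2, 1/2]
# is supported on a set of finite measure, but its Fourier transform is
# sinc(y) = sin(pi y)/(pi y), which vanishes only at nonzero integers, so
# {y : fhat(y) != 0} has infinite Lebesgue measure.
N = 20000
x = (np.arange(N) + 0.5) / N - 0.5   # midpoint grid on [-1/2, 1/2]
dx = 1.0 / N

y = np.arange(101) + 0.5             # sample halfway between consecutive zeros
fhat = np.array([np.sum(np.exp(-2j * np.pi * x * yy)) * dx for yy in y])
sinc = np.sin(np.pi * y) / (np.pi * y)

# The quadrature reproduces sinc, and |fhat| is nonzero arbitrarily far
# out: it decays only like 1/|y| between its zeros.
err = np.max(np.abs(fhat - sinc))
print(err, np.abs(fhat[-1]))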

## Hardy's uncertainty principle.

Another natural way of measuring concentration is to consider the rate of decay of the function at infinity. A result of G.H. Hardy [a12] states that $f$ and $\hat { f }$ cannot both be "very rapidly decreasing". More precisely: if $| f ( x ) | \leq A e ^ { - \pi a x ^ { 2 } }$ and $| \widehat { f } ( y ) | \leq B e ^ { - \pi b y ^ { 2 } }$ for some positive constants $A$, $a$, $B$, $b$ and for all $x , y \in \mathbf{R}$, and if $a b > 1$, then $f \equiv 0$. (If $a b < 1$, then there are infinitely many linearly independent functions $f$ satisfying the inequalities, and if $a b = 1$, then $f$ must necessarily be a Gaussian function.) Actually, the first part of Hardy's result can be deduced from the following more general result of A. Beurling [a14]: If $f \in L ^ { 2 } ( \mathbf{R} )$ is such that

\begin{equation*} \int _ { - \infty } ^ { \infty } \int _ { - \infty } ^ { \infty } | f ( x ) | \left| \hat { f } ( y ) \right| e ^ { 2 \pi | x y | } d x d y < \infty, \end{equation*}

then $f \equiv 0$. There are various refinements of Hardy's theorem (see [a6] for one such refinement).
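The borderline case $ab = 1$ can also be verified numerically. The sketch below (an illustration, not part of the original article) takes $f(x) = e^{-\pi x^2}$, which satisfies Hardy's two bounds with $A = B = a = b = 1$, and checks by quadrature that, with the $e^{-2\pi i x y}$ convention, $\hat{f} = f$, so the Gaussian does occur at $ab = 1$, exactly as Hardy's theorem allows:

```python
import numpy as np

# Hardy's borderline case ab = 1: f(x) = e^{-pi x^2} satisfies
# |f(x)| <= e^{-pi x^2} and |fhat(y)| <= e^{-pi y^2} with a = b = 1.
# With the e^{-2 pi i x y} convention this Gaussian is its own Fourier
# transform, consistent with Hardy's conclusion that only Gaussians
# survive at ab = 1.
N = 4001
x = np.linspace(-8.0, 8.0, N)
dx = x[1] - x[0]
f = np.exp(-np.pi * x ** 2)

y = np.linspace(-3.0, 3.0, 121)
fhat = np.array([np.sum(f * np.exp(-2j * np.pi * x * yy)) * dx for yy in y])

# fhat coincides with e^{-pi y^2} up to quadrature/rounding error.
err = np.max(np.abs(fhat - np.exp(-np.pi * y ** 2)))
print(err)
```

By contrast, for this same $f$ the Beurling integrand equals $e^{-\pi(|x| - |y|)^2}$, which is identically $1$ on the diagonal, so Beurling's finiteness condition fails and his theorem does not force $f \equiv 0$.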

## Other directions.

Apart from the three instances of the mathematical uncertainty principle described above, there are a host of uncertainty principles associated with different ways of measuring concentration (see, e.g., [a2], [a3], [a5], [a7], [a8], [a9], [a15], [a16], [a18], [a19]).

If $G$ is a locally compact group (including the case $G = \mathbf{R} ^ { n }$), then it is possible to develop a Fourier transform theory for functions defined on $G$ (cf. also Harmonic analysis, abstract). There is a considerable body of literature devoted to deriving various uncertainty principles in this context also. (See the bibliography in [a10].)

The Fourier inversion formula can be thought of as an eigenfunction expansion with respect to the standard Laplacian (cf. also Laplace operator; Eigen function). So it is natural to seek uncertainty principles associated with other eigenfunction expansions. Although this has not been as systematically developed as in the case of standard Fourier transform theory, there are several results in this direction as well (see [a17] and the bibliography in [a10]).

This article was adapted from an original article by A. Sitaram (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098. URL: http://encyclopediaofmath.org/index.php?title=Uncertainty_principle,_mathematical&oldid=49898