
User:Richard Pinch/sandbox-15


Flow

A flow in an oriented graph $G$ with values in an Abelian group $A$ is a map $F$ from the edge set $E(G)$ to $A$ satisfying the Kirchhoff law that for each vertex $v$ $$ \sum_{e\rightarrow v} F(e) = \sum_{e\leftarrow v} F(e) \ , $$ that is, the flow into $v$ is equal to the flow out of $v$. A non-zero flow is one with the condition $F(e) \ne 0$ for all $e$.

The number $f_G(n)$ of non-zero flows on a graph $G$ with values in a finite Abelian group $A$ depends only on the order $n$ of $A$ and not on its group structure, and is a polynomial function of $n$, called the flow polynomial. This number does not depend on the orientation of the graph, so we may speak of the flow polynomial of an unoriented graph.
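
For small graphs the flow polynomial can be evaluated directly from the definition by enumerating all assignments of non-zero values in $\mathbf{Z}/n$ to the edges. The following Python sketch (the function name nonzero_flows and the edge-list representation are purely illustrative) counts nowhere-zero $\mathbf{Z}/n$-flows by brute force; for the oriented triangle it returns $n-1$.

    from itertools import product

    def nonzero_flows(edges, vertices, n):
        # Count maps F : E(G) -> {1, ..., n-1} (the non-zero elements of Z/n)
        # satisfying Kirchhoff's law (flow in = flow out) at every vertex.
        count = 0
        for values in product(range(1, n), repeat=len(edges)):
            net = {v: 0 for v in vertices}
            for (u, v), f in zip(edges, values):
                net[u] = (net[u] - f) % n   # f flows out of u
                net[v] = (net[v] + f) % n   # f flows into v
            if all(x == 0 for x in net.values()):
                count += 1
        return count

    # The oriented triangle: its flow polynomial is n - 1.
    triangle = [(0, 1), (1, 2), (2, 0)]
    print([nonzero_flows(triangle, range(3), n) for n in range(2, 7)])  # [1, 2, 3, 4, 5]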

The flow polynomial is a contraction–deletion invariant: if $e$ is an edge which is not a loop, and $G/ e$ and $G\setminus e$ denote the contraction and deletion of $e$ respectively, then $$ f_{G/ e}(n) = f_G(n) + f_{G\setminus e}(n) \ . $$
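
For example, let $G$ be the triangle and $e$ any of its edges. Then $G/e$ is a pair of vertices joined by two parallel edges, with $f_{G/e}(n) = n-1$, while $G\setminus e$ is a path, which contains a bridge and so admits no non-zero flow, $f_{G\setminus e}(n) = 0$; the recurrence gives $f_G(n) = (n-1) - 0 = n-1$, in agreement with the direct count above.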

Tutte's five-flow conjecture states that if $G$ is a bridgeless graph, then it has a non-zero $n$-flow for $n=5$. It is known that this holds for $n=6$ (Seymour's six-flow theorem).

For a connected planar graph $G$ with planar dual $G^*$ there is a remarkable relation between the flow polynomial $f_G$ and the chromatic polynomial $p_{G^*}$ of the dual: $$ p_{G^*}(n) = n\, f_G(n) \ . $$
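
For example, if $G$ is the triangle then $G^*$ consists of two vertices joined by three parallel edges, with chromatic polynomial $p_{G^*}(n) = n(n-1)$, which is indeed $n\,f_G(n) = n(n-1)$.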



Ranked poset

A partially ordered set $(X,{\le})$ with a rank function $r : X \rightarrow \mathbf{N}$ such that $r(m) = 0$ for some minimal element $m \in X$, and $r(q) = r(p)+1$ when $q$ covers $p$: that is, $p < q$ and there is no $x$ with $p < x < q$.

A graded poset is one in which every minimal element has rank $0$ and every maximal element has the same rank.
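
For example, the poset of all subsets of a finite set, ordered by inclusion and with rank given by cardinality, is a graded poset.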

References

  • K. Engel, Sperner Theory, Encyclopedia of Mathematics and its Applications 65, Cambridge (1997) ISBN 0-521-45206-6 Zbl 0868.05001

Cage

A trivalent graph with given girth and minimum number of vertices. A $k$-cage is a cage of girth $k$.
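
For example, the complete graph $K_4$ is the $3$-cage, the complete bipartite graph $K_{3,3}$ is the $4$-cage, and the Petersen graph, on $10$ vertices, is the $5$-cage.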

References

  • W.T. Tutte, "Connectivity in graphs", Mathematical Expositions 15, University of Toronto Press (1966) Zbl 0146.45603

Girth

One of the numerical characteristics of a graph: the length of a shortest cycle in an undirected graph which is not a forest (a forest, having no cycles, has no girth).
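
For example, a graph has girth $3$ precisely when it contains a triangle, and the $n$-dimensional cube graph ($n \ge 2$) has girth $4$.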

References

  • W.T. Tutte, "Connectivity in graphs", Mathematical Expositions 15, University of Toronto Press (1966) Zbl 0146.45603

Moment generating function

Subgaussian distribution

A probability distribution with tails comparable to those of the Gaussian distribution. A random variable $X$ with mean $\mu$ is sub-Gaussian if there is a positive constant $\sigma$ such that $$ \mathsf{E}[\exp(\lambda(X-\mu))] \le \exp(\lambda^2\sigma^2/2) $$ for all real $\lambda$.

Clearly any normal random variable $N(\mu,\sigma^2)$ is sub-Gaussian with parameter $\sigma$.
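
For example, a Rademacher variable $\varepsilon$ taking the values $\pm 1$ with probability $1/2$ each satisfies $\mathsf{E}[\exp(\lambda\varepsilon)] = \cosh\lambda \le \exp(\lambda^2/2)$, so it is sub-Gaussian with parameter $1$ although it is not normal.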

In particular, a sub-Gaussian random variable has a moment generating function defined for all real $\lambda$; a centred random variable ($\mu = 0$) is sub-Gaussian with parameter $\sigma$ precisely when $$ \mathsf{E}[\exp(\lambda X)] \le \exp(\lambda^2\sigma^2/2) $$ for all real $\lambda$.

References

  • Martin J. Wainwright, "High-dimensional Statistics", Cambridge (2019) ISBN 978-1-108-49802-9

Markov inequality in probability theory

A simple inequality giving a bound on the probability of deviation of a random variable with finite expectation from a given value. Let $X = X(\omega)$ be a random variable with $\mathsf{E}|X| < \infty$ and let $\epsilon > 0$. Then $$ \mathsf{P}\{|X| > \epsilon\} \le \frac{\mathsf{E}|X|}{\epsilon} \ . $$

The inequality follows from comparing the indicator function of the set $\{\omega : |X(\omega)| > \epsilon\}$ with the random variable $|X|/\epsilon$, which dominates it, and then taking expectations.
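
As a quick numerical illustration (a minimal Python sketch; the exponential distribution with mean $1$ is chosen only for concreteness), the empirical tail probability never exceeds the Markov bound $\mathsf{E}|X|/\epsilon = 1/\epsilon$.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=1.0, size=1_000_000)    # non-negative, E|X| = 1
    for eps in (1.0, 2.0, 5.0):
        empirical = np.mean(x > eps)
        print(eps, empirical, 1.0 / eps)               # empirical tail vs Markov bound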

Application of Markov's inequality to the random variable $Y = (X-\mu)^2$, where $\mu = \mathsf{E}X$, yields the Chebyshev inequality in probability theory. More generally, if $X$ has a finite central moment of order $k$, then $$ \mathsf{P}\{|X-\mu| > \epsilon\} \le \frac{\mathsf{E}[|X-\mu|^k]}{\epsilon^k} \ . $$

If $X-\mu$ has a moment generating function in a neighbourhood $[-b,b]$ of zero, then for $0 < \lambda \le b$ $$ \mathsf{P}\{|X-\mu| > \epsilon\} = \mathsf{P}\{\exp(\lambda|X-\mu|) > \exp(\lambda\epsilon)\} \le \frac{\mathsf{E}[\exp(\lambda|X-\mu|)]}{\exp(\lambda\epsilon)} $$ so that we obtain the Chernoff inequality $$ \log \mathsf{P}\{|X-\mu| > \epsilon\} \le \inf_{0<\lambda\le b} \left\{ \log \mathsf{E}[\exp(\lambda|X-\mu|)] - \lambda\epsilon \right\} \ . $$
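
For example, if $X$ is $N(\mu,\sigma^2)$ then the one-sided form of the same argument, applied to $X-\mu$ rather than $|X-\mu|$, uses $\mathsf{E}[\exp(\lambda(X-\mu))] = \exp(\lambda^2\sigma^2/2)$; minimising $\lambda^2\sigma^2/2 - \lambda\epsilon$ over $\lambda > 0$ at $\lambda = \epsilon/\sigma^2$ gives $$ \mathsf{P}\{X-\mu > \epsilon\} \le \exp(-\epsilon^2/2\sigma^2) \ . $$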


References

  • Geoffrey Grimmett, David Stirzaker, "Probability and Random Processes" (4th ed), Oxford (2020) ISBN 0-19-884759-9
  • Martin J. Wainwright, "High-dimensional Statistics", Cambridge (2019) ISBN 978-1-108-49802-9


Steinitz exchange lemma

A lemma on linearly independent sets and spanning sets in a vector space from which it is possible to deduce that dimension is well-defined.

Let $X = \{x_1,\ldots,x_m\}$ be a linearly independent set and $Y = \{y_1,\ldots,y_n\}$ a spanning set in a vector space $V$. Then $m \le n$ and $Y$ can be re-ordered so that $\{x_1,\ldots,x_m,y_{m+1},\ldots,y_n\}$ is also a spanning set.
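
For example, in $V = \mathbf{R}^2$ take $X = \{(1,1)\}$ and $Y = \{(1,0),(0,1)\}$. Since $(1,1) = (1,0) + (0,1)$ with a non-zero coefficient on $(1,0)$, the vector $(1,0)$ may be exchanged for $(1,1)$: the set $\{(1,1),(0,1)\}$ still spans $\mathbf{R}^2$.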

As a corollary, if $X$ and $Y$ are both bases (linearly independent and spanning) for $V$, then $n = m$. Hence all bases for $V$ have the same size, and this shows that the dimension of $V$ is well-defined.

References

  • P. M. Cohn, "Classic Algebra", Wiley (2000) ISBN 0-471-87731-X

Hopf constant

A constant $q_\infty$ associated with the Milne problem in radiative transfer theory. It may be defined as $$ q_\infty = \frac{1}{\pi} \int_0^{\pi/2} \left({\frac{3}{\sin^2\theta} - \frac{1}{1-\theta\cot\theta}} \right) d\theta \ . $$ Its numerical value is $$ q_\infty = 0.7104460895 \dots\ . $$
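
The value may be checked by numerical integration. Below is a minimal Python sketch using scipy.integrate.quad; the integrand has a removable singularity at $\theta = 0$ (its limit there is $6/5$), and Gauss–Kronrod quadrature does not evaluate the endpoints, so no special handling is needed.

    import numpy as np
    from scipy.integrate import quad

    def integrand(t):
        # 3/sin^2(t) - 1/(1 - t*cot(t)); the limit as t -> 0 is 6/5
        return 3.0 / np.sin(t)**2 - 1.0 / (1.0 - t / np.tan(t))

    value, _ = quad(integrand, 0.0, np.pi / 2)
    print(value / np.pi)    # approximately 0.710446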

References

  • Steven R. Finch, Mathematical Constants, Cambridge University Press (2003) ISBN 0-521-81805-2 Zbl 1054.00001

Hoeffding inequality

A simple inequality giving a bound on the probability of deviation of a sum of independent random variables from a given value. Let $X_1,\ldots,X_n$ be independent sub-Gaussian random variables with means $\mu_i$ and sub-Gaussian parameters $\sigma_i$. Then for $t > 0$ $$ \mathsf{P}\left\{{\sum_{i=1}^n (X_i-\mu_i) > t }\right\} \le \exp\left({-\frac{t^2}{2\sum_{i=1}^n \sigma_i^2} }\right) \ . $$ In particular, if the $X_i$ are bounded, $|X_i| \le b$, then they are sub-Gaussian with parameter $b$ and the inequality takes the form $$ \mathsf{P}\left\{{\sum_{i=1}^n (X_i-\mu_i) > t }\right\} \le \exp\left({-\frac{t^2}{2 n b^2} }\right) \ . $$
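
As a numerical illustration, the following minimal Python sketch (assuming Rademacher variables $X_i = \pm 1$, which have mean $0$ and are bounded with $b = 1$) compares the empirical tail of the sum with the Hoeffding bound $\exp(-t^2/2n)$.

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 100, 100_000
    # Sums of n independent Rademacher (+1/-1) variables: mean 0, |X_i| <= 1.
    sums = rng.choice([-1, 1], size=(trials, n)).sum(axis=1)
    for t in (10, 20, 30):
        empirical = np.mean(sums > t)
        bound = np.exp(-t**2 / (2 * n))
        print(t, empirical, bound)      # empirical tail stays below the bound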

References

  • Martin J. Wainwright, "High-dimensional Statistics", Cambridge (2019) ISBN 978-1-108-49802-9