# Vapnik-Chervonenkis dimension


Vapnik–Červonenkis dimension, VC-dimension

Let $H = (V_H,E_H)$ be a hypergraph. The Vapnik–Chervonenkis dimension of $H$ is the largest cardinality of a subset $F$ of $V_H$ that is shattered by $E_H$, i.e. such that for every $A \subseteq F$ there is an $E \in E_H$ with $A = F \cap E$. Thus, it coincides with the index of a Vapnik–Chervonenkis class. It is usually denoted by $\mathrm{VC}(H)$.
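For finite hypergraphs the definition can be checked directly by exhaustive search. The following sketch (not part of the original article; the function names `is_shattered` and `vc_dimension` are illustrative) tests every subset $F$ in decreasing order of size and verifies that all $2^{|F|}$ subsets of $F$ arise as traces $F \cap E$:

```python
from itertools import combinations

def is_shattered(F, edges):
    """F is shattered by `edges` iff every subset A of F
    equals F ∩ E for some edge E, i.e. iff the set of
    traces {F ∩ E : E ∈ edges} has all 2^|F| elements."""
    F = frozenset(F)
    traces = {F & E for E in edges}
    return len(traces) == 2 ** len(F)

def vc_dimension(vertices, edges):
    """Largest |F|, F ⊆ vertices, with F shattered by `edges`.
    Brute force; exponential in |vertices|, for illustration only."""
    edges = [frozenset(E) for E in edges]
    for k in range(len(vertices), -1, -1):
        if any(is_shattered(F, edges) for F in combinations(vertices, k)):
            return k
    return 0  # only reached if `edges` is empty

# Example: the hypergraph of all discrete intervals on {1,2,3,4}.
# Two points can always be shattered, but no three points can
# (an interval containing the two outer points contains the middle one),
# so the VC dimension is 2.
intervals = [frozenset(s) for s in
             [[], [1], [2], [3], [4], [1, 2], [2, 3], [3, 4],
              [1, 2, 3], [2, 3, 4], [1, 2, 3, 4]]]
print(vc_dimension([1, 2, 3, 4], intervals))  # → 2
```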

Computing the Vapnik–Chervonenkis dimension is $\mathcal{NP}$-hard (cf. also $\mathcal{NP}$) for many classes of hypergraphs, [a1], [a2].

The Vapnik–Chervonenkis dimension plays an important role in learning theory, especially in probably approximately correct (PAC) learning. In particular, learnability of classes of $\{0,1\}$-valued functions is equivalent to finiteness of the Vapnik–Chervonenkis dimension, [a3].
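A standard quantitative form of this equivalence (not stated in the article itself, but well known from the PAC-learning literature) is that for a class of Vapnik–Chervonenkis dimension $d = \mathrm{VC}(H)$,

$$m = O\!\left(\frac{1}{\varepsilon}\left(d\,\log\frac{1}{\varepsilon} + \log\frac{1}{\delta}\right)\right)$$

samples suffice for a consistent learner to output, with probability at least $1-\delta$, a hypothesis of error at most $\varepsilon$; conversely, when $d$ is infinite no finite sample size suffices.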

For the role of the Vapnik–Chervonenkis dimension in neural networks, see, e.g., [a4], [a5].

The independence number of a hypergraph $H$ is the maximal cardinality of a subset $A$ of $V_H$ that does not contain any $E \in E_H$ (see also Graph, numerical characteristics of a). This notion is closely related to $\mathrm{VC}(H)$, [a6], [a7].
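For a finite hypergraph the independence number, like the VC dimension, admits a direct brute-force computation. This sketch (illustrative only; the name `independence_number` is not from the article) searches subsets in decreasing order of size for one containing no edge:

```python
from itertools import combinations

def independence_number(vertices, edges):
    """Largest |A|, A ⊆ vertices, such that no edge E ∈ edges
    satisfies E ⊆ A. Brute force, for illustration only."""
    edges = [frozenset(E) for E in edges]
    for k in range(len(vertices), -1, -1):
        for A in combinations(vertices, k):
            A = frozenset(A)
            if not any(E <= A for E in edges):
                return k
    return 0  # only reached if the empty edge belongs to `edges`

# Example: with edges {1,2} and {3,4} on {1,2,3,4}, every 3-element
# subset contains one of the edges, while e.g. {1,3} contains neither,
# so the independence number is 2.
print(independence_number([1, 2, 3, 4], [{1, 2}, {3, 4}]))  # → 2
```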

How to Cite This Entry:
Vapnik-Chervonenkis dimension. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Vapnik-Chervonenkis_dimension&oldid=39971
This article was adapted from an original article by M. Hazewinkel (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article