Vapnik-Chervonenkis dimension


Vapnik–Červonenkis dimension, VC-dimension

Let $H = (V_H,E_H)$ be a hypergraph. The Vapnik–Chervonenkis dimension of $H$ is the largest cardinality of a subset $F$ of $V_H$ that is shattered by $E_H$, i.e., such that for every $A \subseteq F$ there is an $E \in E_H$ with $A = F \cap E$. Thus, it is the same as the index of a Vapnik–Chervonenkis class. It is usually denoted by $\mathrm{VC}(H)$.
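For a finite hypergraph this definition can be checked directly by enumeration. The following sketch (illustrative only; the names `subsets`, `is_shattered` and `vc_dimension` are not from the cited literature) computes $\mathrm{VC}(H)$ by brute force and applies it to the intervals on $\{1,2,3,4\}$, which shatter every two-point set but no three-point set.

    from itertools import combinations

    def subsets(s):
        """All subsets of s, as frozensets."""
        s = list(s)
        return [frozenset(c)
                for r in range(len(s) + 1)
                for c in combinations(s, r)]

    def is_shattered(F, edges):
        """F is shattered by the edges if every A, a subset of F, arises as F & E."""
        traces = {F & E for E in edges}
        return all(A in traces for A in subsets(F))

    def vc_dimension(vertices, edges):
        """Brute-force VC dimension: the largest |F| with F shattered by the edges."""
        edges = [frozenset(E) for E in edges]
        return max(len(F) for F in subsets(vertices) if is_shattered(F, edges))

    # Example: the edges are all intervals {i,...,j} in {1,2,3,4}.
    V = {1, 2, 3, 4}
    E = [set(range(i, j + 1)) for i in V for j in V if i <= j]
    print(vc_dimension(V, E))  # prints 2: every two-point set is shattered, no three-point set is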

Computing the Vapnik–Chervonenkis dimension is $\mathcal{NP}$-hard (cf. also $\mathcal{NP}$) for many classes of hypergraphs, [a1], [a2].

The Vapnik–Chervonenkis dimension plays an important role in learning theory, especially in probably approximately correct (PAC) learning. In particular, a class of $\{0,1\}$-valued functions is PAC learnable if and only if its Vapnik–Chervonenkis dimension is finite, [a3].
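For orientation, a standard quantitative form of this connection (classical in PAC learning, though not stated in the article itself) bounds the number $m$ of random examples that suffice to learn a class of VC dimension $d$ to accuracy $\varepsilon$ with confidence $1-\delta$:

$$ m = O\!\left(\frac{1}{\varepsilon}\Big(d \log\frac{1}{\varepsilon} + \log\frac{1}{\delta}\Big)\right), $$

while $\Omega\big((d + \log(1/\delta))/\varepsilon\big)$ examples are necessary; in particular, an infinite Vapnik–Chervonenkis dimension rules out any finite sample-size bound.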

For the role of the Vapnik–Chervonenkis dimension in neural networks, see, e.g., [a4], [a5].

The independence number of a hypergraph $H$ is the maximal cardinality of a subset $A$ of $V_H$ that does not contain any $E \in E_H$ (see also Graph, numerical characteristics of a). This notion is closely related to $\mathrm{VC}(H)$, [a6], [a7].
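The independence number can be computed by the same kind of enumeration. The sketch below (again illustrative, reusing `subsets` and `V` from the earlier example) returns the largest vertex set containing no edge; for the intervals of length at least two on $\{1,2,3,4\}$ it gives $2$, attained e.g. by $\{1,3\}$.

    def independence_number(vertices, edges):
        """Largest |A|, A a subset of V, such that no edge E satisfies E <= A."""
        edges = [frozenset(E) for E in edges]
        return max(len(A) for A in subsets(vertices)
                   if not any(E <= A for E in edges))

    # Intervals {i,...,j} with i < j in {1,2,3,4}: an edge-free set may not keep two consecutive vertices.
    E2 = [set(range(i, j + 1)) for i in V for j in V if i < j]
    print(independence_number(V, E2))  # prints 2, e.g. A = {1, 3}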

References

[a1] E. Kranakis, D. Krizanc, B. Ruf, J. Urrutia, G. Wöginger, "The VC-dimension of set systems defined by graphs" Discr. Appl. Math. , 77 : 3 (1997) pp. 237–257
[a2] C.H. Papadimitriou, M. Yannakakis, "On limited nondeterminism and the complexity of VC-dimension" J. Comput. Syst. Sci. , 53 : 2 (1996) pp. 161–170
[a3] S. Ben-David, N. Cesa-Bianchi, D. Haussler, P.M. Long, "Characterizations of learnability of $\{0,\ldots,n\}$-valued functions" J. Comput. Syst. Sci. , 50 : 1 (1995) pp. 74–86 DOI 10.1006/jcss.1995.1008 Zbl 0827.68095
[a4] S.B. Holden, "Neural networks and the VC-dimension" J.G. McWhirter (ed.) , Mathematics in Signal Processing , III , Oxford Univ. Press (1994) pp. 73–84
[a5] W. Maass, "Perspectives of current research about the complexity of learning on neural nets" V. Roychowdhury (ed.) et al. (ed.) , Theoretical Advances in Neural Computation and Learning , Kluwer Acad. Publ. (1994) pp. 295–336
[a6] D.Q. Naiman, H.P. Wynn, "Independence number and the complexity of families of sets" Discr. Math. , 154 (1996) pp. 203–216
[a7] J. Pach, P.K. Agarwal, "Combinatorial geometry" , Wiley/Interscience (1995) pp. 247–254
This article was adapted from an original article by M. Hazewinkel (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.