Vapnik–Červonenkis dimension, VC-dimension
Let $H = (V_H,E_H)$ be a hypergraph. The Vapnik–Chervonenkis dimension of $H$ is the largest cardinality of a subset $F$ of $V_H$ that is shattered by $E_H$, i.e. such that for every $A \subseteq F$ there is an $E \in E_H$ with $A = F \cap E$. Thus, it is the same as the index of a Vapnik–Chervonenkis class. It is usually denoted by $\mathrm{VC}(H)$.
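To make the definition concrete, the following is a minimal brute-force sketch in Python (not part of the original article; the names powerset, is_shattered and vc_dimension are chosen only for this illustration). It computes $\mathrm{VC}(H)$ for a hypergraph given by explicit vertex and edge sets, by testing every subset of $V_H$ for shattering; the running time is exponential in $|V_H|$.

    from itertools import chain, combinations

    def powerset(s):
        # All subsets of s, as frozensets.
        s = list(s)
        return [frozenset(c) for c in
                chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

    def is_shattered(F, edges):
        # F is shattered if every subset A of F occurs as a trace F & E, E an edge.
        traces = {frozenset(F) & frozenset(E) for E in edges}
        return all(A in traces for A in powerset(F))

    def vc_dimension(vertices, edges):
        # Largest |F| over subsets F of the vertex set that are shattered by the edges.
        # Brute force over all subsets: exponential in |vertices|.
        best = -1  # convention here: -1 if not even the empty set is shattered (E_H empty)
        for F in powerset(vertices):
            if len(F) > best and is_shattered(F, edges):
                best = len(F)
        return best

    # Example: V_H = {1, 2, 3}, E_H = {{1, 2}, {2, 3}}.
    # The singleton {1} is shattered ({1} & {2, 3} = {} and {1} & {1, 2} = {1}),
    # but no 2-element subset is, so VC(H) = 1.
    print(vc_dimension({1, 2, 3}, [{1, 2}, {2, 3}]))  # prints 1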
Computing the Vapnik–Chervonenkis dimension is $\mathcal{NP}$-hard (cf. also $\mathcal{NP}$) for many classes of hypergraphs, [a1], [a2].
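A standard observation (not made in the original article) explains the precise flavour of this hardness: if a set $F$ with $|F| = d$ is shattered, its $2^d$ subsets arise as traces $F \cap E$ of distinct edges, so $|E_H| \ge 2^d$ and hence $\mathrm{VC}(H) \le \log_2 |E_H|$. Exhaustive search over subsets of size at most $\log_2 |E_H|$ therefore computes the dimension in quasi-polynomial time $|V_H|^{O(\log |E_H|)}$, which is why [a2] studies the problem in the framework of limited nondeterminism rather than ordinary $\mathcal{NP}$-completeness.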
The Vapnik–Chervonenkis dimension plays an important role in learning theory, especially in probably approximately correct (PAC) learning: PAC learnability of a class of $\{0,1\}$-valued functions is equivalent to finiteness of its Vapnik–Chervonenkis dimension, [a3].
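As a quantitative complement (standard bounds, not stated in the original article): for a class of $\{0,1\}$-valued functions of Vapnik–Chervonenkis dimension $d < \infty$, PAC learning to accuracy $\epsilon$ with confidence $1-\delta$ is possible from $m = O\!\left(\frac{1}{\epsilon}\left(d \log \frac{1}{\epsilon} + \log \frac{1}{\delta}\right)\right)$ examples, while every PAC learner requires $m = \Omega\!\left(\frac{d}{\epsilon}\right)$ examples; the two directions together give the equivalence of learnability and finiteness of the dimension.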
For the role of the Vapnik–Chervonenkis dimension in neural networks, see, e.g., [a4], [a5].
The independence number of a hypergraph $H$ is the maximal cardinality of a subset $A$ of $V_H$ that does not contain any $E \in E_H$ (see also Graph, numerical characteristics of a). This notion is closely related to $\mathrm{VC}(H)$, [a6], [a7].
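For instance (an illustrative example not in the original article), for the hypergraph with $V_H = \{1,2,3\}$ and $E_H = \{\{1,2\},\{2,3\}\}$ used in the sketch above, the set $\{1,3\}$ contains no edge, so the independence number is $2$, whereas $\mathrm{VC}(H) = 1$.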
References
[a1] E. Kranakis, D. Krizanc, B. Ruf, J. Urrutia, G. Wöginger, "The VC-dimension of set systems defined by graphs", Discr. Appl. Math. 77:3 (1997) pp. 237–257
[a2] C.H. Papadimitriou, M. Yannakakis, "On limited nondeterminism and the complexity of VC-dimension", J. Comput. Syst. Sci. 53:2 (1996) pp. 161–170
[a3] S. Ben-David, N. Cesa-Bianchi, D. Haussler, P.M. Long, "Characterizations of learnability of $\{0,\ldots,n\}$-valued functions", J. Comput. Syst. Sci. 50:1 (1995) pp. 74–86, DOI 10.1006/jcss.1995.1008, Zbl 0827.68095
[a4] S.B. Holden, "Neural networks and the VC-dimension", in J.G. McWhirter (ed.), Mathematics in Signal Processing III, Oxford Univ. Press (1994) pp. 73–84
[a5] W. Maass, "Perspectives of current research about the complexity of learning on neural nets", in V. Roychowdhury et al. (eds.), Theoretical Advances in Neural Computation and Learning, Kluwer Acad. Publ. (1994) pp. 295–336
[a6] D.Q. Naiman, H.P. Wynn, "Independence number and the complexity of families of sets", Discr. Math. 154 (1996) pp. 203–216
[a7] J. Pach, P.K. Agarwal, "Combinatorial geometry", Wiley/Interscience (1995) pp. 247–254