PROBABILISTIC ESTIMATION OF VAPNIK-CHERVONENKIS DIMENSION

Przemyslaw Klesk

Abstract

We present an idea for the probabilistic estimation of the Vapnik-Chervonenkis (VC) dimension of a given set of indicator functions. The idea is embodied in two algorithms we propose, named A and A. Both algorithms are based on an approach that can be described as 'expand or divide and conquer', and both are parametrized by probabilistic constraints expressed in the form of an (epsilon, delta)-precision. This precision bounds how often and by how much the estimate may deviate from the true VC-dimension. We also analyze the convergence and computational complexity of the proposed algorithms.
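The setting sketched in the abstract (a class of indicator functions, tested for shattering by repeated random trials) can be illustrated with a naive Monte Carlo sketch. This is not the paper's algorithm; the sampling scheme, the trial count, and the `estimate_vc_dim` helper are illustrative assumptions only, and no (epsilon, delta) guarantee is derived here.

```python
import random

def realized_dichotomies(functions, points):
    """Distinct labelings the function class induces on the given points."""
    return {tuple(f(x) for x in points) for f in functions}

def shatters(functions, points):
    """True if all 2^n labelings of the n points are realized."""
    return len(realized_dichotomies(functions, points)) == 2 ** len(points)

def estimate_vc_dim(functions, domain, trials=200, rng=random):
    """Largest n for which some sampled n-point subset is shattered.

    For each candidate size n we draw `trials` random n-subsets of
    `domain`. One shattered subset certifies VC-dim >= n; failure on
    all trials is only probabilistic evidence that VC-dim < n, which
    is exactly the kind of error an (epsilon, delta) analysis would
    have to control.
    """
    n = 0
    while n + 1 <= len(domain):
        if not any(shatters(functions, rng.sample(domain, n + 1))
                   for _ in range(trials)):
            break
        n += 1
    return n

# Hypothetical example: threshold indicators 1[x >= t] on {0, ..., 9};
# their true VC-dimension is 1 (the labeling (1, 0) on an ordered pair
# is never realized).
thresholds = [lambda x, t=t: int(x >= t)
              for t in [i + 0.5 for i in range(-1, 10)]]
vc_est = estimate_vc_dim(thresholds, list(range(10)))
```

Note the exponential cost of the exhaustive shattering check: verifying one n-point subset compares against all 2^n labelings, which is why a probabilistic treatment with controlled precision is attractive in the first place.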

References

  1. Anthony, M. and Bartlett, P. (2009). Neural Network Learning: Theoretical Foundations. Cambridge University Press, Cambridge, UK.
  2. Bartlett, P., Kulkarni, S., and Posner, S. (1997). Covering numbers for real-valued function classes. IEEE Transactions on Information Theory, 47:1721-1724.
  3. Cherkassky, V. and Mulier, F. (1998). Learning from Data. John Wiley & Sons, Inc.
  4. Graham, R., Knuth, D., and Patashnik, O. (2002). Concrete Mathematics: A Foundation for Computer Science. Wydawnictwo Naukowe PWN SA, Warsaw, Poland.
  5. Hellman, M. and Raviv, J. (1970). Probability of error, equivocation and the Chernoff bound. IEEE Transactions on Information Theory, IT-16(4):368-372.
  6. Papadimitriou, C. and Yannakakis, M. (1996). On limited nondeterminism and the complexity of the V-C dimension. Journal of Computer and System Sciences, 53:161-170.
  7. Schmidt, J., Siegel, A., and Srinivasan, A. (1995). Chernoff-Hoeffding bounds for applications with limited independence. SIAM Journal on Discrete Mathematics, 8(2):223-250.
  8. Vapnik, V. (1995). The Nature of Statistical Learning Theory. Springer Verlag, New York.
  9. Vapnik, V. (1998). Statistical Learning Theory: Inference from Small Samples. Wiley, New York.
  10. Vapnik, V. and Chervonenkis, A. (1968). On the uniform convergence of relative frequencies of events to their probabilities. Dokl. Akad. Nauk, 181.
  11. Vapnik, V. and Chervonenkis, A. (1989). The necessary and sufficient conditions for the consistency of the method of empirical risk minimization. Yearbook of the Academy of Sciences of the USSR on Recognition, Classification and Forecasting, 2:217-249.
  12. Wenocur, R. and Dudley, R. (1981). Some special Vapnik-Chervonenkis classes. Discrete Mathematics, 33:313-318.
  13. Zhang, T. (2002). Covering number bounds of certain regularized linear function classes. Journal of Machine Learning Research, 2:527-550.


Paper Citation


in Harvard Style

Klesk P. (2012). PROBABILISTIC ESTIMATION OF VAPNIK-CHERVONENKIS DIMENSION. In Proceedings of the 4th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART, ISBN 978-989-8425-95-9, pages 262-270. DOI: 10.5220/0003721702620270


in Bibtex Style

@conference{icaart12,
author={Przemyslaw Klesk},
title={PROBABILISTIC ESTIMATION OF VAPNIK-CHERVONENKIS DIMENSION},
booktitle={Proceedings of the 4th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART},
year={2012},
pages={262-270},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003721702620270},
isbn={978-989-8425-95-9},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 4th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART
TI - PROBABILISTIC ESTIMATION OF VAPNIK-CHERVONENKIS DIMENSION
SN - 978-989-8425-95-9
AU - Klesk P.
PY - 2012
SP - 262
EP - 270
DO - 10.5220/0003721702620270