MIXTURES OF GAUSSIAN DISTRIBUTIONS UNDER LINEAR DIMENSIONALITY REDUCTION

Ahmed Otoom, Oscar Perez Concha, Massimo Piccardi

Abstract

High-dimensional spaces pose a serious challenge to the learning process: the combination of a limited number of samples and high dimensionality places many problems under the “curse of dimensionality”, which severely restricts the practical application of density estimation. Many techniques have been proposed in the past to discover embedded, locally linear manifolds of lower dimensionality, including the mixture of Principal Component Analyzers, the mixture of Probabilistic Principal Component Analyzers and the mixture of Factor Analyzers. In this paper, we present a mixture model for reducing dimensionality based on a linear transformation that is not restricted to be orthogonal. Two methods are proposed for learning all the transformations and mixture parameters: the first is based on an iterative maximum-likelihood approach, and the second on random transformations and fixed (non-iterative) probability functions. For experimental validation, we have applied the proposed model to maximum-likelihood classification of five “hard” data sets, including data sets from the UCI repository and the authors’ own. Moreover, we compared the classification performance of the proposed method with that of other popular classifiers, including the mixture of Probabilistic Principal Component Analyzers and the Gaussian mixture model. In all cases but one, the accuracy achieved by the proposed method was the highest, with increases over the runner-up ranging from 0.2% to 5.2%.
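The second learning method described in the abstract — a random (non-orthogonal) linear projection combined with fixed, non-iterative density estimates — can be illustrated with a minimal sketch. This is not the authors' implementation: it fits a single Gaussian per class rather than a full mixture, uses a toy two-class data set, and all names (`fit_gaussian`, `classify`, the dimensions `D`, `d`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two classes in D=50 dimensions, projected down to d=5.
D, d, n = 50, 5, 200
X0 = rng.normal(0.0, 1.0, size=(n, D))
X1 = rng.normal(0.5, 1.0, size=(n, D))

# Random linear projection, not constrained to be orthogonal.
W = rng.normal(size=(D, d)) / np.sqrt(d)

def fit_gaussian(Z):
    """Fixed (closed-form, non-iterative) Gaussian fit in the reduced space."""
    mu = Z.mean(axis=0)
    C = np.cov(Z, rowvar=False) + 1e-6 * np.eye(Z.shape[1])  # regularized
    return mu, C

def log_likelihood(Z, mu, C):
    """Gaussian log-density of each row of Z."""
    diff = Z - mu
    _, logdet = np.linalg.slogdet(C)
    mahal = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(C), diff)
    return -0.5 * (mahal + logdet + Z.shape[1] * np.log(2 * np.pi))

# One Gaussian per class, estimated in the projected space.
params = [fit_gaussian(X @ W) for X in (X0, X1)]

def classify(X):
    """Maximum-likelihood class assignment in the projected space."""
    scores = np.column_stack(
        [log_likelihood(X @ W, mu, C) for mu, C in params])
    return scores.argmax(axis=1)

y_pred = classify(np.vstack([X0, X1]))
y_true = np.concatenate([np.zeros(n, int), np.ones(n, int)])
acc = (y_pred == y_true).mean()
```

Because the projection is drawn once at random and the densities are closed-form, no EM-style iteration is needed; the paper's first method would instead refine the transformation itself by iterative maximum likelihood.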

References

  1. Asuncion, A. and Newman, D. (2007). UCI machine learning repository.
  2. Bartholomew, D. J. (1987). Latent Variable Models and Factor Analysis. Charles Griffin & Co. Ltd., London.
  3. Bellman, R. (1961). Adaptive control processes - A guided tour. Princeton University Press, Princeton, New Jersey.
  4. Bingham, E. and Mannila, H. (2001). Random projection in dimensionality reduction: applications to image and text data. In Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-2001), pages 245-250.
  5. Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.
  6. Bolton, R. J. and Krzanowski, W. J. (1999). A characterization of principal components for projection pursuit. The American Statistician, 53(2):108-109.
  7. Breiman, L. and Spector, P. (1992). Submodel selection and evaluation in regression: The x-random case. International Statistical Review, 60(3):291-319.
  8. Fodor, I. (2002). A survey of dimension reduction techniques. Technical Report UCRL-ID-148494, Lawrence Livermore National Laboratory.
  9. Ghahramani, Z. and Hinton, G. (1997). The EM algorithm for mixtures of factor analyzers. Technical Report CRG-TR-96-1, University of Toronto.
  10. Hinton, G. E., Dayan, P., and Revow, M. (1997). Modeling the manifolds of images of handwritten digits. IEEE Transactions on Neural Networks, 8(1):65-74.
  11. Johnson, W. B. and Lindenstrauss, J. (1984). Extensions of Lipschitz mappings into a Hilbert space. In Conference in Modern Analysis and Probability, Contemporary Mathematics, volume 26, pages 189-206.
  12. Kaski, S. (1998). Dimensionality reduction by random mapping: Fast similarity computation for clustering. In Proceedings of IJCNN'98, International Joint Conference on Neural Networks, volume 1, pages 413-418. IEEE Service Center.
  13. Kittler, J. (1998). Combining classifiers: A theoretical framework. Pattern Analysis and Applications, 1(1):18-27.
  14. Otoom, A. et al. (2007). Towards automatic abandoned object classification in visual surveillance systems. In Asia-Pacific Workshop on Visual Information Processing, pages 143-149, Tainan, Taiwan.
  15. Tipping, M. E. and Bishop, C. M. (1999a). Mixtures of probabilistic principal component analyzers. Neural Computation, 11(2):443-482.
  16. Tipping, M. E. and Bishop, C. M. (1999b). Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 61(3):611-622.

Paper Citation


in Harvard Style

Otoom A., Perez Concha O. and Piccardi M. (2010). MIXTURES OF GAUSSIAN DISTRIBUTIONS UNDER LINEAR DIMENSIONALITY REDUCTION. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2010), ISBN 978-989-674-029-0, pages 511-518. DOI: 10.5220/0002844005110518


in Bibtex Style

@conference{visapp10,
author={Ahmed Otoom and Oscar Perez Concha and Massimo Piccardi},
title={MIXTURES OF GAUSSIAN DISTRIBUTIONS UNDER LINEAR DIMENSIONALITY REDUCTION},
booktitle={Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2010)},
year={2010},
pages={511-518},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002844005110518},
isbn={978-989-674-029-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2010)
TI - MIXTURES OF GAUSSIAN DISTRIBUTIONS UNDER LINEAR DIMENSIONALITY REDUCTION
SN - 978-989-674-029-0
AU - Otoom A.
AU - Perez Concha O.
AU - Piccardi M.
PY - 2010
SP - 511
EP - 518
DO - 10.5220/0002844005110518