MULTI-REGULARIZATION PARAMETERS ESTIMATION FOR GAUSSIAN MIXTURE CLASSIFIER BASED ON MDL PRINCIPLE

Xiuling Zhou, Ping Guo, C. L. Philip Chen

2011

Abstract

Regularization addresses the unstable estimation of the covariance matrix in a Gaussian classifier when only a small sample set is available, and estimating multiple regularization parameters is more difficult than estimating a single one. In this paper, the KLIM_L covariance matrix estimator is derived theoretically from the MDL (minimum description length) principle for the small-sample, high-dimensional problem. KLIM_L generalizes KLIM (Kullback-Leibler information measure) by taking the local difference in each dimension into account. Under the MDL framework, the multiple regularization parameters are selected by the criterion of minimizing the KL divergence and are estimated simply and directly by point estimation, approximated with a second-order Taylor expansion. Estimating the multiple regularization parameters in KLIM_L therefore costs less computation time than in RDA (regularized discriminant analysis) and LOOC (leave-one-out covariance matrix estimate), where cross-validation is adopted, and the proposed KLIM_L estimator achieves higher classification accuracy in experiments.
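
For readers who want to experiment with the general idea, the sketch below (Python with NumPy) shows a per-dimension regularized covariance estimate of the kind the abstract describes: the sample covariance is stabilized by adding a diagonal matrix of regularization parameters chosen in closed form from the data rather than by cross-validation. The function name klim_l_style_covariance and the particular point estimate of each lambda_j are illustrative assumptions, not the KLIM_L formula derived in the paper.

import numpy as np

def klim_l_style_covariance(X, eps=1e-12):
    """Illustrative per-dimension regularized covariance estimate.

    Hedged sketch only: the sample covariance is stabilized by adding
    a diagonal matrix with one regularization parameter per dimension,
    chosen by a simple closed-form rule instead of cross-validation.
    The choice of lambda_j below (each dimension's sample variance
    scaled by 1/N) is an assumption for illustration and is NOT the
    KLIM_L estimator derived in the paper.
    """
    N, d = X.shape
    # Sample mean and (biased) sample covariance.
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / N

    # Per-dimension regularization parameters (illustrative point
    # estimate: stronger stabilization when N is small).
    lam = np.diag(S) / max(N, 1) + eps

    # Regularized covariance: sample covariance plus a diagonal
    # matrix of per-dimension parameters.
    return S + np.diag(lam)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(15, 30))           # N = 15 samples, d = 30 dimensions
    Sigma_reg = klim_l_style_covariance(X)
    # The regularized matrix is full rank even though N < d, so it can
    # be inverted inside a Gaussian (quadratic discriminant) classifier.
    print(np.linalg.matrix_rank(Sigma_reg))  # -> 30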

References

  1. Bishop, C. M., 2007. Pattern recognition and machine learning, Springer-Verlag New York, Inc. Secaucus, NJ, USA.
  2. Everitt, B. S., Hand, D., 1981. Finite Mixture Distributions, Chapman and Hall, London.
  3. Friedman, J. H., 1989. Regularized discriminant analysis, Journal of the American Statistical Association, vol. 84, no. 405, 165-175.
  4. Hoffbeck, J. P. and Landgrebe, D. A., 1996. Covariance matrix estimation and classification with limited training data, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 7, 763-767.
  5. Schafer, J. and Strimmer, K., 2005. A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics, Statistical Applications in Genetics and Molecular Biology, vol. 4, no. 1.
  6. Srivastava, S., Gupta, M. R., Frigyik, B. A., 2007. Bayesian quadratic discriminant analysis, J. Mach. Learning Res. 8, 1277-1305.
  7. Bickel, P. J. and Levina, E., 2008. Regularized estimation of large covariance matrices, Annals of Statistics, vol. 36, no. 1, 199-227.
  8. Friedman, J., Hastie, T., and Tibshirani, R., 2008. Sparse inverse covariance estimation with the graphical lasso, Biostatistics, vol. 9, no. 3, 432-441.
  9. Cao, G., Bachega, L. R., Bouman, C. A., 2011. The sparse matrix transform for covariance estimation and analysis of high dimensional signals, IEEE Transactions on Image Processing, vol. 20, no. 3, 625-640.
  10. Rivals, I., Personnaz, L., 1999. On cross validation for model selection, Neural Comput. 11, 863-870.
  11. Guo, P., Jia, Y., and Lyu, M. R., 2008. A study of regularized Gaussian classifier in high-dimension small sample set case based on MDL principle with application to spectrum recognition, Pattern Recognition, vol. 41, 2842-2854.
  12. Redner, R. A., Walker, H. F., 1984. Mixture densities, maximum likelihood and the EM algorithm, SIAM Rev. 26, 195-239.
  13. Aeberhard, S., Coomans, D., de Vel, O., 1994. Comparative analysis of statistical pattern recognition methods in high dimensional settings, Pattern Recognition 27 (8), 1065-1077.
  14. Rissanen, J., 1978. Modeling by shortest data description, Automatica 14, 465-471.
  15. Barron, A., Rissanen, J., Yu, B., 1998. The minimum description length principle in coding and modeling, IEEE Trans. Inform. Theory 44 (6), 2743-2760.
  16. Kullback, S., 1959. Information Theory and Statistics, Wiley, New York.
  17. Nene, S. A., Nayar, S. K. and Murase, H., 1996. Columbia Object Image Library (COIL-20), Technical Report CUCS-005-96.


Paper Citation


in Harvard Style

Zhou X., Guo P. and Chen C. (2011). MULTI-REGULARIZATION PARAMETERS ESTIMATION FOR GAUSSIAN MIXTURE CLASSIFIER BASED ON MDL PRINCIPLE. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011) ISBN 978-989-8425-84-3, pages 112-117. DOI: 10.5220/0003669301120117


in Bibtex Style

@conference{ncta11,
author={Xiuling Zhou and Ping Guo and C. L. Philip Chen},
title={MULTI-REGULARIZATION PARAMETERS ESTIMATION FOR GAUSSIAN MIXTURE CLASSIFIER BASED ON MDL PRINCIPLE},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)},
year={2011},
pages={112-117},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003669301120117},
isbn={978-989-8425-84-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)
TI - MULTI-REGULARIZATION PARAMETERS ESTIMATION FOR GAUSSIAN MIXTURE CLASSIFIER BASED ON MDL PRINCIPLE
SN - 978-989-8425-84-3
AU - Zhou X.
AU - Guo P.
AU - Chen C.
PY - 2011
SP - 112
EP - 117
DO - 10.5220/0003669301120117