ON REDUCING DIMENSIONALITY OF DISSIMILARITY MATRICES FOR OPTIMIZING DBC - An Experimental Comparison

Sang-Woon Kim

2010

Abstract

One problem with dissimilarity-based classifications (DBCs) is the high dimensionality of the dissimilarity matrices. To address this problem, two kinds of solutions have been proposed in the literature: prototype selection (PS) based methods and dimensionality reduction (DR) based methods. The DR-based method builds the dissimilarity matrix using all the available training samples and subsequently applies one of the standard DR schemes, whereas the PS-based method directly chooses a small set of representatives from the training samples. Although DR-based and PS-based methods have been explored separately by many researchers, little analysis has been done to compare the two. This paper therefore aims to find a suitable method for optimizing DBCs through a comparative study. In the experiments, four DR and four PS methods are used to reduce the dimensionality of the dissimilarity matrices, and the classification accuracies of the resulting DBCs trained on two real-life benchmark databases are analyzed. Our empirical evaluation of the two approaches demonstrates that the DR-based method improves classification accuracy more than the PS-based method. In particular, the experimental results show that the DR-based method is clearly more useful for nonparametric classifiers, but not for parametric ones.
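To make the two routes concrete, the following is a minimal Python sketch, not taken from the paper: it assumes the Euclidean distance as the dissimilarity measure, PCA as a stand-in DR scheme, random selection as a stand-in PS scheme, the scikit-learn digits data in place of the paper's benchmark databases, and a 1-NN classifier as the DBC. All of these choices are illustrative assumptions; the paper's own experiments use four DR and four PS methods.

# Sketch of the two ways to shrink a dissimilarity representation.
# Assumptions (not from the paper): Euclidean dissimilarities, PCA as the
# DR scheme, random prototype selection as the PS scheme, 1-NN as the DBC.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.metrics import pairwise_distances
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# DR-based route: dissimilarities to *all* training samples, then project.
D_tr = pairwise_distances(X_tr, X_tr)          # n_train x n_train
D_te = pairwise_distances(X_te, X_tr)          # n_test  x n_train
pca = PCA(n_components=30).fit(D_tr)
clf_dr = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(D_tr), y_tr)
acc_dr = clf_dr.score(pca.transform(D_te), y_te)

# PS-based route: dissimilarities to a small prototype set only.
rng = np.random.default_rng(0)
protos = rng.choice(len(X_tr), size=30, replace=False)
clf_ps = KNeighborsClassifier(n_neighbors=1).fit(
    pairwise_distances(X_tr, X_tr[protos]), y_tr)
acc_ps = clf_ps.score(pairwise_distances(X_te, X_tr[protos]), y_te)

print(f"DR-based DBC accuracy: {acc_dr:.3f}")
print(f"PS-based DBC accuracy: {acc_ps:.3f}")

Which route wins on this toy setup depends on the data and on how many dimensions or prototypes are retained; the paper's conclusions concern the benchmark databases and the parametric and nonparametric classifiers studied there.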



Paper Citation


in Harvard Style

Kim S. (2010). ON REDUCING DIMENSIONALITY OF DISSIMILARITY MATRICES FOR OPTIMIZING DBC - An Experimental Comparison. In Proceedings of the 2nd International Conference on Agents and Artificial Intelligence - Volume 1: ICAART, ISBN 978-989-674-021-4, pages 235-240. DOI: 10.5220/0002713002350240


in Bibtex Style

@conference{icaart10,
author={Sang-Woon Kim},
title={ON REDUCING DIMENSIONALITY OF DISSIMILARITY MATRICES FOR OPTIMIZING DBC - An Experimental Comparison},
booktitle={Proceedings of the 2nd International Conference on Agents and Artificial Intelligence - Volume 1: ICAART},
year={2010},
pages={235-240},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002713002350240},
isbn={978-989-674-021-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 2nd International Conference on Agents and Artificial Intelligence - Volume 1: ICAART
TI - ON REDUCING DIMENSIONALITY OF DISSIMILARITY MATRICES FOR OPTIMIZING DBC - An Experimental Comparison
SN - 978-989-674-021-4
AU - Kim S.
PY - 2010
SP - 235
EP - 240
DO - 10.5220/0002713002350240