Applications of Discriminative Dimensionality Reduction

Barbara Hammer, Andrej Gisbrecht, Alexander Schulz

Abstract

Discriminative nonlinear dimensionality reduction aims at visualizing a given data set such that the information in the data points that is relevant for a given class labeling is displayed. We link this task to the integration of the Fisher information, and we discuss how it differs from supervised classification. We present two potential application areas: speeding up unsupervised nonlinear visualization by integrating prior knowledge, and visualizing a given classifier, such as an SVM, in low dimensions.
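One way the Fisher information enters discriminative dimensionality reduction is as a local Riemannian metric: distances are rescaled so that directions along which the class posterior p(c|x) changes count more than directions along which it is constant. The following sketch illustrates this idea under our own assumptions (a Parzen-window posterior estimate with a hand-picked bandwidth, and metric evaluation at the segment midpoint); function names are illustrative, not from the paper.

```python
import numpy as np

def class_posteriors_and_grads(x, X, y, sigma=1.0):
    """Parzen-window estimate of p(c|x) and its gradient at point x.

    X : (n, d) training data, y : (n,) integer class labels.
    Returns posteriors p (n_classes,) and gradients dp (n_classes, d).
    """
    diff = X - x                                            # (n, d)
    w = np.exp(-0.5 * np.sum(diff**2, axis=1) / sigma**2)   # kernel weights
    classes = np.unique(y)
    p = np.zeros(len(classes))
    dp = np.zeros((len(classes), X.shape[1]))
    total = w.sum()
    # gradient of the unnormalized density wrt x: sum_i w_i (X_i - x) / sigma^2
    g_total = (w[:, None] * diff).sum(axis=0) / sigma**2
    for k, c in enumerate(classes):
        mask = (y == c)
        s_c = w[mask].sum()
        g_c = (w[mask][:, None] * diff[mask]).sum(axis=0) / sigma**2
        p[k] = s_c / total
        # quotient rule for p(c|x) = s_c / total
        dp[k] = (g_c * total - s_c * g_total) / total**2
    return p, dp

def fisher_metric(x, X, y, sigma=1.0, eps=1e-9):
    """Local Fisher information J(x) = sum_c p(c|x) * g_c g_c^T,
    where g_c = grad_x log p(c|x)."""
    p, dp = class_posteriors_and_grads(x, X, y, sigma)
    J = np.zeros((X.shape[1], X.shape[1]))
    for pk, dpk in zip(p, dp):
        g = dpk / max(pk, eps)        # gradient of log-posterior
        J += pk * np.outer(g, g)
    return J

def discriminative_dist2(a, b, X, y, sigma=1.0):
    """Squared distance under the Fisher metric evaluated at the midpoint."""
    J = fisher_metric(0.5 * (a + b), X, y, sigma)
    v = a - b
    return float(v @ J @ v)
```

Distances computed this way can be fed to any distance-based visualizer (e.g. t-SNE): a displacement that crosses a class boundary receives a large discriminative distance, while a displacement of the same Euclidean length parallel to the boundary receives a small one, so the embedding emphasizes class-relevant structure.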



Paper Citation


in Harvard Style

Hammer B., Gisbrecht A. and Schulz A. (2013). Applications of Discriminative Dimensionality Reduction. In Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-8565-41-9, pages 33-41. DOI: 10.5220/0004245300330041


in Bibtex Style

@conference{icpram13,
author={Barbara Hammer and Andrej Gisbrecht and Alexander Schulz},
title={Applications of Discriminative Dimensionality Reduction},
booktitle={Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2013},
pages={33-41},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004245300330041},
isbn={978-989-8565-41-9},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - Applications of Discriminative Dimensionality Reduction
SN - 978-989-8565-41-9
AU - Hammer B.
AU - Gisbrecht A.
AU - Schulz A.
PY - 2013
SP - 33
EP - 41
DO - 10.5220/0004245300330041