Globerson, A. and Roweis, S. T. (2005). Metric learning by
collapsing classes. In Advances in neural information
processing systems, pages 451–458.
Goldberger, J., Roweis, S., Hinton, G., and Salakhutdinov,
R. (2004). Neighbourhood components analysis. In
Advances in Neural Information Processing Systems
17.
He, X., Cai, D., Yan, S., and Zhang, H.-J. (2005). Neigh-
borhood preserving embedding. In Computer Vision
(ICCV), 10th IEEE International Conference on, vol-
ume 2, pages 1208–1213.
Hinton, G. E. and Salakhutdinov, R. R. (2006). Reducing
the dimensionality of data with neural networks. Sci-
ence, 313(5786):504–507.
Huang, C.-L. and Wang, C.-J. (2006). A GA-based feature selection and parameters optimization for support vector machines. Expert Systems with Applications, 31(2):231–240.
Huang, G.-B., Zhu, Q.-Y., and Siew, C.-K. (2006). Extreme
learning machine: theory and applications. Neuro-
computing, 70(1):489–501.
Huang, H.-L. and Chang, F.-L. (2007). ESVM: Evolutionary support vector machine for automatic feature selection and classification of microarray data. Biosystems, 90(2):516–528.
Jain, A. K., Duin, R. P. W., and Mao, J. (2000). Statistical
pattern recognition: a review. Pattern Analysis and
Machine Intelligence, IEEE Transactions on, 22(1):4–
37.
Kim, H., Howland, P., and Park, H. (2005). Dimension reduction in text classification with support vector machines. Journal of Machine Learning Research, 6:37–53.
Ma, Y. and Fu, Y. (2011). Manifold Learning Theory and
Applications. CRC Press.
Ngiam, J., Coates, A., Lahiri, A., Prochnow, B., Le, Q. V.,
and Ng, A. Y. (2011). On optimization methods for
deep learning. In Proceedings of the 28th Interna-
tional Conference on Machine Learning (ICML-11),
pages 265–272.
Niyogi, X. (2004). Locality preserving projections. In Neu-
ral information processing systems, volume 16, page
153.
Pearson, K. (1901). On lines and planes of closest fit to
systems of points in space. The London, Edinburgh,
and Dublin Philosophical Magazine and Journal of
Science, 2(11):559–572.
Ranawana, R. and Palade, V. (2006). Multi-classifier sys-
tems: Review and a roadmap for developers. Interna-
tional Journal of Hybrid Intelligent Systems, 3(1):35–
61.
Schölkopf, B., Smola, A., and Müller, K.-R. (1998). Nonlinear component analysis as a kernel eigenvalue problem. Neural computation, 10(5):1299–1319.
Spearman, C. (1904). “General intelligence,” objectively determined and measured. The American Journal of Psychology, 15(2):201–292.
Tenenbaum, J. B., De Silva, V., and Langford, J. C. (2000).
A global geometric framework for nonlinear dimen-
sionality reduction. Science, 290(5500):2319–2323.
Thornton, C., Hutter, F., Hoos, H. H., and Leyton-Brown,
K. (2013). Auto-WEKA: Combined selection and
hyperparameter optimization of classification algo-
rithms. In Proc. of KDD-2013, pages 847–855.
Van der Maaten, L. (2014). Matlab Toolbox for Dimensionality Reduction. http://homepage.tudelft.nl/19j49/Matlab_Toolbox_for_Dimensionality_Reduction.html.
Van der Maaten, L., Postma, E., and Van Den Herik, H.
(2009). Dimensionality reduction: A comparative re-
view. Journal of Machine Learning Research, 10:1–
41.
Weinberger, K. Q. and Saul, L. K. (2009). Distance metric
learning for large margin nearest neighbor classifica-
tion. Journal of Machine Learning Research, 10:207–
244.
Wolpert, D. H. (1996). The lack of a priori distinctions
between learning algorithms. Neural computation,
8(7):1341–1390.
Zhang, T., Yang, J., Zhao, D., and Ge, X. (2007). Linear
local tangent space alignment and application to face
recognition. Neurocomputing, 70(7):1547–1553.
APPENDIX
List of manifold learning methods that are used in the
framework, including references and abbreviations:
Principal Component Analysis (PCA) (Pearson, 1901), Kernel-PCA with polynomial and Gaussian kernel (Schölkopf et al., 1998), Denoising Autoencoder (Hinton and Salakhutdinov, 2006), Local Linear Embedding (LLE) (Donoho and Grimes, 2003), Isomap (Tenenbaum et al., 2000), Manifold Charting (Brand, 2002), Laplacian Eigenmaps (Belkin and Niyogi, 2001), Linear Local Tangent Space Alignment algorithm (LLTSA) (Zhang et al., 2007), Locality Preserving Projection (LPP) (Niyogi, 2004), Neighborhood Preserving Embedding (NPE) (He et al., 2005), Factor Analysis (Spearman, 1904), Linear Discriminant Analysis (LDA) (Fisher, 1936), Maximally Collapsing Metric Learning (MCML) (Globerson and Roweis, 2005), Neighborhood Components Analysis (NCA) (Goldberger et al., 2004), Large-Margin Nearest Neighbor (LMNN) (Weinberger and Saul, 2009).
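For readers who want to try a subset of the listed methods, the following is a minimal, hypothetical sketch using scikit-learn implementations; it is not the framework's own code (which builds on the Matlab Toolbox for Dimensionality Reduction cited above), and the Iris data, target dimension, and method defaults are illustrative assumptions.

```python
# Illustrative sketch only: scikit-learn counterparts of a few of the
# listed manifold learning methods, each reducing data to 2 dimensions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA, KernelPCA, FactorAnalysis
from sklearn.manifold import Isomap, LocallyLinearEmbedding

X, y = load_iris(return_X_y=True)  # placeholder dataset, 150 x 4
target_dim = 2                     # assumed target dimensionality

methods = {
    "PCA": PCA(n_components=target_dim),
    "Kernel-PCA (Gaussian)": KernelPCA(n_components=target_dim, kernel="rbf"),
    "Factor Analysis": FactorAnalysis(n_components=target_dim),
    "Isomap": Isomap(n_components=target_dim),
    "LLE": LocallyLinearEmbedding(n_components=target_dim),
}

# Fit each method and collect the low-dimensional embeddings.
embeddings = {name: m.fit_transform(X) for name, m in methods.items()}
for name, Z in embeddings.items():
    print(f"{name}: {X.shape} -> {Z.shape}")
```

Each call maps the same input to a 2-D embedding; within a benchmarking framework such embeddings would then be fed to downstream classifiers for comparison.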
ICPRAM 2015 - International Conference on Pattern Recognition Applications and Methods