Biehl, M., Schneider, P., Smith, D., Stiekema, H., Tay-
lor, A., Hughes, B., Shackleton, C., Stewart, P., and
Arlt, W. (2012). Matrix relevance LVQ in steroid
metabolomics based classification of adrenal tumors.
In ESANN.
Brown, E. T., Liu, J., Brodley, C. E., and Chang, R.
(2012). Dis-function: Learning distance functions in-
teractively. In 2012 IEEE Conference on Visual An-
alytics Science and Technology (VAST), pages 83–92.
IEEE.
Bunte, K., Biehl, M., and Hammer, B. (2012a). A general
framework for dimensionality reducing data visualiza-
tion mapping. Neural Computation, 24(3):771–804.
Bunte, K., Schneider, P., Hammer, B., Schleif, F.-M., Vill-
mann, T., and Biehl, M. (2012b). Limited rank matrix
learning, discriminative dimension reduction and vi-
sualization. Neural Networks, 26:159–173.
Committee on the Analysis of Massive Data, Committee on
Applied and Theoretical Statistics, Board on Mathe-
matical Sciences and Their Applications, Division on
Engineering and Physical Sciences, and National Re-
search Council (2013). Frontiers in Massive Data
Analysis. The National Academies Press.
Efron, B., Hastie, T., Johnstone, I., and Tibshirani, R.
(2004). Least angle regression. Annals of Statistics,
32:407–499.
Endert, A., Fiaux, P., and North, C. (2012). Semantic inter-
action for visual text analytics. In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems, pages 473–482. ACM.
Gisbrecht, A. and Hammer, B. (2014). Data visualization
by nonlinear dimensionality reduction. WIREs Data
Mining and Knowledge Discovery.
Gisbrecht, A., Schulz, A., and Hammer, B. (2014). Para-
metric nonlinear dimensionality reduction using ker-
nel t-SNE. Neurocomputing.
Goldberger, J., Roweis, S., Hinton, G., and Salakhutdinov,
R. (2004). Neighbourhood components analysis. In
Advances in Neural Information Processing Systems
17, pages 513–520. MIT Press.
Hammer, B., Gisbrecht, A., and Schulz, A. (2013). Appli-
cations of discriminative dimensionality reduction. In
ICPRAM.
Hammer, B., He, H., and Martinetz, T. (2014). Learning and
modeling big data. In Verleysen, M., editor, ESANN,
pages 343–352.
Jin, Y. and Hammer, B. (2014). Computational intelligence
in big data [guest editorial]. IEEE Computational
Intelligence Magazine, 9(3):12–13.
Kalil, T. (2012). Big data is a big deal. The White House.
Lee, J. A. and Verleysen, M. (2009). Quality assessment
of dimensionality reduction: Rank-based criteria. Neu-
rocomputing, 72(7-9):1431–1443.
Lee, J. A., Renard, E., Bernard, G., Dupont, P., and Verley-
sen, M. (2013). Type 1 and 2 mixtures of Kullback-
Leibler divergences as cost functions in dimensional-
ity reduction based on similarity preservation. Neuro-
computing, 112:92–108.
Lee, J. A. and Verleysen, M. (2007). Nonlinear dimension-
ality reduction. Springer.
Lee, J. A. and Verleysen, M. (2010). Scale-independent
quality criteria for dimensionality reduction. Pattern
Recognition Letters, 31:2248–2257.
Mokbel, B., Paassen, B., and Hammer, B. (2014). Adaptive
distance measures for sequential data. In Verleysen,
M., editor, ESANN, pages 265–270.
Peltonen, J., Sandholm, M., and Kaski, S. (2013). Infor-
mation retrieval perspective to interactive data visual-
ization. In Hlawitschka, M. and Weinkauf, T., editors,
Proceedings of Eurovis 2013, The Eurographics Con-
ference on Visualization. The Eurographics Associa-
tion.
Riedmiller, M. and Braun, H. (1993). A direct adap-
tive method for faster backpropagation learning: The
RPROP algorithm. In Proceedings of the IEEE Inter-
national Conference on Neural Networks, pages 586–
591. IEEE Press.
Roweis, S. T. and Saul, L. K. (2000). Nonlinear dimen-
sionality reduction by locally linear embedding. Sci-
ence, 290:2323–2326.
Rüping, S. (2006). Learning Interpretable Models. PhD
thesis, Dortmund University.
Schulz, A., Gisbrecht, A., and Hammer, B. (2014). Rele-
vance learning for dimensionality reduction. In Ver-
leysen, M., editor, ESANN, pages 165–170.
Simoff, S. J., Böhlen, M. H., and Mazeika, A., editors
(2008). Visual Data Mining - Theory, Techniques and
Tools for Visual Analytics, volume 4404 of Lecture
Notes in Computer Science. Springer.
Tenenbaum, J., de Silva, V., and Langford, J. (2000). A
global geometric framework for nonlinear dimension-
ality reduction. Science, 290:2319–2323.
van der Maaten, L. and Hinton, G. (2008). Visualizing
high-dimensional data using t-SNE. Journal of Ma-
chine Learning Research, 9:2579–2605.
van der Maaten, L., Postma, E., and van den Herik, H.
(2009). Dimensionality reduction: A comparative re-
view. Technical Report TiCC-TR 2009-005, Tilburg
University.
Vellido, A., Martín-Guerrero, J., and Lisboa, P. (2012).
Making machine learning models interpretable. In
ESANN.
Venna, J., Peltonen, J., Nybo, K., Aidos, H., and Kaski, S.
(2010). Information retrieval perspective to nonlinear
dimensionality reduction for data visualization. Jour-
nal of Machine Learning Research, 11:451–490.
Ward, M., Grinstein, G., and Keim, D. A. (2010). Interac-
tive Data Visualization: Foundations, Techniques, and
Applications. A. K. Peters, Ltd.
Yang, Z., Peltonen, J., and Kaski, S. (2013). Scalable opti-
mization of neighbor embedding for visualization. In
ICML (2), volume 28 of JMLR Proceedings, pages
127–135. JMLR.org.
Zhai, Y., Ong, Y.-S., and Tsang, I. (2014). The emerg-
ing “big dimensionality”. IEEE Computational Intel-
ligence Magazine, 9(3):14–26.