Author:
Oliver Kramer
Affiliation:
Carl von Ossietzky Universität Oldenburg, Germany
Keyword(s):
Non-linear dimensionality reduction, Manifold learning, Unsupervised regression, K-nearest neighbor regression.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Data Manipulation; Evolutionary Computing; Health Engineering and Technology Applications; Human-Computer Interaction; Knowledge Discovery and Information Retrieval; Knowledge-Based Systems; Machine Learning; Methodologies and Methods; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Soft Computing; Symbolic Systems
Abstract:
In many scientific disciplines, structures in high-dimensional data have to be detected, e.g., in stellar spectra, in genome data, or in face recognition tasks. We present an approach to non-linear dimensionality reduction based on fitting nearest neighbor regression into the unsupervised regression framework for learning low-dimensional manifolds. The problem of optimizing latent neighborhoods is difficult to solve, but the unsupervised nearest neighbor (UNN) formulation allows an efficient strategy of iteratively embedding latent points into fixed neighborhood topologies.
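The following is a minimal sketch of such an iterative embedding strategy, not the paper's exact algorithm: it assumes a one-dimensional latent topology and measures the data space reconstruction error (DSRE) by reconstructing each pattern from the patterns at its k nearest latent positions. The names dsre and unn_embed and the mean-based reconstruction rule are illustrative assumptions.

```python
import numpy as np

def dsre(X, order, k=2):
    """Data space reconstruction error under a 1D latent topology:
    each pattern is reconstructed as the mean of the patterns at its
    k nearest positions in the latent order (illustrative choice)."""
    n = len(order)
    err = 0.0
    for i, idx in enumerate(order):
        dists = np.abs(np.arange(n) - i)   # latent distances on the line
        dists[i] = n + 1                   # exclude the point itself
        nbrs = np.argsort(dists)[:min(k, n - 1)]
        recon = X[[order[j] for j in nbrs]].mean(axis=0)
        err += np.sum((X[idx] - recon) ** 2)
    return err

def unn_embed(X, k=2):
    """Iterative embedding into a fixed neighborhood topology: each
    pattern is tested at every position of the current latent order,
    and the position yielding the lowest DSRE is kept."""
    order = [0]
    for idx in range(1, len(X)):
        candidates = [order[:p] + [idx] + order[p:]
                      for p in range(len(order) + 1)]
        order = min(candidates, key=lambda o: dsre(X, o, k))
    return order

# Example: noisy points along a line should yield a (possibly reversed)
# roughly monotone latent order.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 8)[:, None] + 0.01 * rng.standard_normal((8, 1))
print(unn_embed(X, k=2))
```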
The choice of an appropriate loss function is relevant, in particular for noisy and high-dimensional data spaces. We extend UNN regression with the ε-insensitive loss, which ignores residuals below a threshold defined by ε. In the experimental part of this paper, we test the influence of ε on the final data space reconstruction error and present a visualization of UNN embeddings on test data sets.
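As a concrete illustration of the loss, here is a hedged sketch of the standard ε-insensitive loss known from support vector regression, L_ε(r) = max(0, |r| - ε); the function name eps_insensitive_loss and the vectorized form are assumptions, not the paper's code.

```python
import numpy as np

def eps_insensitive_loss(residuals, eps=0.1):
    """Epsilon-insensitive loss: residuals with magnitude at most eps
    are ignored (zero loss); larger residuals are penalized linearly
    by their distance to the eps-tube."""
    r = np.abs(np.asarray(residuals, dtype=float))
    return np.maximum(0.0, r - eps)

# Residuals -0.05 and 0.08 fall inside the eps-tube and contribute
# nothing; 0.3 and -1.2 are penalized by 0.2 and 1.1, respectively.
print(eps_insensitive_loss([-0.05, 0.08, 0.3, -1.2], eps=0.1))
```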