most cases. Although IED-LLE can produce better
results than RLLE when k=4, k=8 and k=10, the
results of RLLE are usually better than those of
IED-LLE when k=18.
It can be seen from Tables 1 and 2 and from
Figures 1 to 4 that if a distance measure can find
good nearest-neighbor candidates, it can also
produce good dimension reduction results when
combined with the LLE method. The experimental
results also show that the C-index produces
evaluation results consistent with those of the
error rate. One benefit of the C-index is that no
clustering step is needed after the dimension
reduction, which avoids any bias that the selected
clustering algorithm might otherwise introduce into
the evaluation of the dimension reduction results.
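As a sketch of how such an evaluation works, the following computes the C-index of Hubert and Schultz (1976) directly from an embedding and its class labels; no clustering algorithm is run. The function name and the use of Euclidean distances in the embedded space are our own illustrative choices, not code from the paper.

```python
import numpy as np
from itertools import combinations

def c_index(X, labels):
    """C-index (Hubert & Schultz, 1976) of an embedding X given class labels.

    C = (S - S_min) / (S_max - S_min), where S is the sum of distances over
    within-class pairs, and S_min / S_max are the sums of the n_w smallest /
    largest pairwise distances overall (n_w = number of within-class pairs).
    Values lie in [0, 1]; lower is better.
    """
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    pairs = list(combinations(range(len(X)), 2))
    # All pairwise Euclidean distances in the embedded space.
    dists = np.array([np.linalg.norm(X[i] - X[j]) for i, j in pairs])
    within = np.array([labels[i] == labels[j] for i, j in pairs])
    S = dists[within].sum()
    n_w = int(within.sum())           # number of within-class pairs
    sorted_d = np.sort(dists)
    S_min = sorted_d[:n_w].sum()      # n_w smallest distances overall
    S_max = sorted_d[-n_w:].sum()     # n_w largest distances overall
    return (S - S_min) / (S_max - S_min)
```

For two well-separated classes the within-class pairs are exactly the shortest pairs, so the index approaches 0, which matches its use here as a clustering-free quality indicator.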
5 CONCLUSIONS
In this work, we use the Rank-order distance instead
of the traditional Euclidean distance to find the
nearest neighbors, and then produce the low-dimensional
representation using a process similar to that of LLE.
It is shown that the proposed RLLE method realizes
dimension reduction more effectively on the two
image datasets than LLE and ISO-LLE, while
producing competitive results compared to IED-LLE.
It is also shown that the Rank-order distance finds
better neighbors than the Euclidean distance and the
geodesic distance for representing the local
configurations of the manifolds. The experimental
results also show that the C-index is another good
indicator for evaluating dimension reduction
results. Our future work will focus on reducing the
time complexity of computing the Rank-order
distance.
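To make the neighbor-selection step concrete, the following is a minimal sketch of the Rank-order distance between two samples, following the standard formulation in the face-clustering literature: each sample's neighbors are ranked by Euclidean distance, and two samples are close when they appear early in each other's ranking lists. The quadratic-time all-pairs ranking shown here is exactly the cost that the future work mentioned above would aim to reduce; function and variable names are our own.

```python
import numpy as np

def rank_order_distance(a, b, X):
    """Rank-order distance between samples a and b (row indices into X).

    D(a, b) = (d(a, b) + d(b, a)) / min(O_a(b), O_b(a)), where d(a, b) sums
    b's ranks of a's nearest neighbors up to b itself, and O_a(b) is b's
    position in a's Euclidean ranking list.
    """
    # Pairwise Euclidean distances and per-sample ranking lists.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    order = np.argsort(d, axis=1)      # order[i][k]: i's k-th nearest sample
    rank = np.argsort(order, axis=1)   # rank[i][j]: position of j in i's list

    def asym(x, y):
        # d(x, y): sum of y's ranks of x's neighbors, up to and including y.
        return sum(rank[y][order[x][k]] for k in range(rank[x][y] + 1))

    denom = min(rank[a][b], rank[b][a])
    return (asym(a, b) + asym(b, a)) / max(denom, 1)
```

The normalization by min(O_a(b), O_b(a)) makes the measure symmetric and emphasizes pairs that rank each other highly, which is why it tends to pick better neighbor candidates than the raw Euclidean distance on manifold-structured image data.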
ACKNOWLEDGEMENT
This work is supported by the National Natural
Science Foundation of China (Grant No. 61272213).
REFERENCES
Belkin, M. & Niyogi, P., (2003) Laplacian eigenmaps for
dimensionality reduction and data representation,
Neural computation, 15 (6), 1373-1396.
Comon, P., (1992). Independent component analysis,
Higher-Order Statistics, 29-38.
Ding, C., He, X, & Zha, H. et al., (2002). Adaptive
dimension reduction for clustering high dimensional
data. Proceeding of the 2002 IEEE International
Conference on Data Mining, pp. 147-154.
Fukunaga, K., (1990). Introduction to statistical pattern
recognition, Academic press.
He, L.M., Jin, W. & Yang, X.B. et al., (2013). An algorithm
research of supervised LLE based on mahalanobis
distance and extreme learning machine. In: Consumer
Electronics, Communications and Networks (CECNet),
2013 3rd International Conference on. IEEE, pp. 76-
79.
Hinton, G. E. & Roweis, S.T., (2002). Stochastic neighbor
embedding. In: Advances in neural information
processing systems. pp. 833-840.
Hubert, L. & Schultz, J., (1976). Quadratic assignment as a
general data analysis strategy, British Journal of
Mathematical and Statistical Psychology, 29 (2), 190-
241.
Hull, J. J., (1994). A database for handwritten text
recognition research, IEEE Transactions on Pattern
Analysis and Machine Intelligence, 16 (5), 550-554.
Kouropteva, O., Okun, O. & Pietikäinen, M., (2002).
Selection of the optimal parameter value for the locally
linear embedding algorithm, Proc. 1st Int. Conf. Fuzzy
Syst. Knowl. Discov., pp. 359 -363.
LeCun, Y., Bottou, L. & Bengio, Y. et al., (1998) Gradient-
based learning applied to document recognition.
Proceedings of the IEEE, 86 (11), 2278-2324.
Pan, Y., Ge, S.S. & Al Mamun, A., (2009). Weighted
locally linear embedding for dimension reduction,
Pattern Recognition, 42 (5), 798-811.
Roweis, S.T. & Saul, L.K., (2000) Nonlinear
dimensionality reduction by locally linear embedding,
Science, 290 (5500), 2323-2326.
Saul, L.K. & Roweis, S.T., (2003). Think globally, fit
locally: unsupervised learning of low dimensional
manifolds, The Journal of Machine Learning Research,
4, 119-155.
Tenenbaum, J.B. et al., (2000). A global geometric
framework for nonlinear dimensionality reduction,
Science, 290 (5500), 2319-2323.
van der Maaten, L.J.P., Postma, E.O. & van den Herik,
H.J., (2009). Dimensionality reduction: A
comparative review, Tilburg Univ., Tilburg, The
Netherlands, Tech. Rep. TiCC-TR 2009-005.
Varini, C., Degenhard, A. & Nattkemper, T.W., (2006).
ISOLLE: LLE with geodesic distance,
Neurocomputing, 69 (13), 1768-1771.
Wold, S., Esbensen, K. & Geladi, P., (1987). Principal
component analysis, Chemometrics and intelligent
laboratory systems, 2 (1), 37-52.
Zhang, L. & Wang, N., (2007). Locally linear embedding
based on image Euclidean distance. In: Automation and
Logistics, 2007 IEEE International Conference on.
IEEE, pp. 1914-1918.
Zhang, X.F. & Huang, S.B., (2012). Mahalanobis
distance measurement based locally linear
embedding algorithm, Pattern Recognition and
Artificial Intelligence, 25, 318-324.
Zhang, Z. & Wang, J., (2006). MLLE: Modified locally
linear embedding using multiple weights. In: Advances