structure. For example, the experiments considered in
(Liu et al., 2011b) cover the UMIST face, MovieLens
and USPS datasets, which are known to contain an
underlying manifold structure. The challenge is that
meaningful privileged information has to be found
for these datasets. Secondly, the methods should be
compared against standard manifold classifiers to
assess their performance. Finally, alternative kernel
methods apart from SVORIM could be considered to-
gether with the proposals in this paper.
ACKNOWLEDGEMENTS
This work has been subsidized by the TIN2011-22794
project of the Spanish Ministerial Commission of Sci-
ence and Technology (MICYT), FEDER funds and
the P11-TIC-7508 project of the “Junta de Andalucía”
(Spain).
REFERENCES
Asuncion, A. and Newman, D. (2007). UCI machine learn-
ing repository.
Baccianella, S., Esuli, A., and Sebastiani, F. (2009). Evalu-
ation measures for ordinal regression. In Proceedings
of the Ninth International Conference on Intelligent
Systems Design and Applications (ISDA 09), pages
283–287, Pisa, Italy.
Belkin, M. and Niyogi, P. (2001). Laplacian eigenmaps and
spectral techniques for embedding and clustering. In
NIPS, volume 14, pages 585–591.
Cardoso, J. S. and da Costa, J. F. P. (2007). Learning to clas-
sify ordinal data: The data replication method. Jour-
nal of Machine Learning Research, 8:1393–1429.
Cheng, J., Wang, Z., and Pollastri, G. (2008). A neural net-
work approach to ordinal regression. In Proceedings
of the IEEE International Joint Conference on Neu-
ral Networks (IJCNN2008, IEEE World Congress on
Computational Intelligence), pages 1279–1284. IEEE
Press.
Chu, W. and Ghahramani, Z. (2005). Gaussian processes
for ordinal regression. Journal of Machine Learning
Research, 6:1019–1041.
Chu, W. and Keerthi, S. S. (2007). Support vector ordinal
regression. Neural Computation, 19(3):792–815.
Cruz-Ramírez, M., Hervás-Martínez, C., Sánchez-
Monedero, J., and Gutiérrez, P. A. (2014). Metrics
to guide a multi-objective evolutionary algorithm for
ordinal classification. Neurocomputing, 135:21–31.
Deng, W.-Y., Zheng, Q.-H., Lian, S., Chen, L., and Wang,
X. (2010). Ordinal extreme learning machine. Neuro-
computing, 74(1-3):447–456.
Dijkstra, E. W. (1959). A note on two problems in connex-
ion with graphs. Numerische Mathematik, 1(1):269–
271.
Frank, E. and Hall, M. (2001). A simple approach to ordi-
nal classification. In Proc. of the 12th Eur. Conf. on
Machine Learning, pages 145–156.
Gutiérrez, P. A., Pérez-Ortiz, M., Fernández-Navarro,
F., Sánchez-Monedero, J., and Hervás-Martínez, C.
(2012). An Experimental Study of Different Ordi-
nal Regression Methods and Measures. In 7th Inter-
national Conference on Hybrid Artificial Intelligence
Systems (HAIS), volume 7209 of Lecture Notes in
Computer Science, pages 296–307.
He, X. and Niyogi, P. (2003). Locality preserving projec-
tions. In NIPS, volume 16, pages 234–241.
Kira, K. and Rendell, L. A. (1992). The feature selection
problem: Traditional methods and a new algorithm.
In AAAI, pages 129–134.
Li, L. and Lin, H.-T. (2007). Ordinal Regression by Ex-
tended Binary Classification. In Advances in Neural
Inform. Processing Syst. 19.
Lin, H.-T. and Li, L. (2012). Reduction from cost-sensitive
ordinal ranking to weighted binary classification. Neu-
ral Computation, 24(5):1329–1367.
Liu, Y., Liu, Y., and Chan, K. C. C. (2011a). Ordinal regres-
sion via manifold learning. In Burgard, W. and Roth,
D., editors, Proceedings of the 25th AAAI Conference
on Artificial Intelligence (AAAI’11), pages 398–403.
AAAI Press.
Liu, Y., Liu, Y., Chan, K. C. C., and Zhang, J. (2012).
Neighborhood preserving ordinal regression. In Pro-
ceedings of the 4th International Conference on Inter-
net Multimedia Computing and Service (ICIMCS12),
pages 119–122, New York, NY, USA. ACM.
Liu, Y., Liu, Y., Zhong, S., and Chan, K. C. (2011b).
Semi-supervised manifold ordinal regression for im-
age ranking. In Proceedings of the 19th ACM inter-
national conference on Multimedia (ACM MM2011),
pages 1393–1396, New York, NY, USA. ACM.
McCullagh, P. (1980). Regression models for ordinal data.
Journal of the Royal Statistical Society, 42(2):109–
142.
PASCAL (2011). PASCAL (Pattern Analysis, Statistical
Modelling and Computational Learning) machine
learning benchmarks repository.
Herbrich, R., Graepel, T., and Obermayer, K. (2000). Large
margin rank boundaries for ordinal regression. In
Smola, A., Bartlett, P., Schölkopf, B., and Schuur-
mans, D., editors, Advances in Large Margin Classi-
fiers, pages 115–132. MIT Press.
Shashua, A. and Levin, A. (2003). Ranking with large mar-
gin principle: Two approaches. In Advances in Neural
Information Processing Systems (NIPS), pages 937–
944. MIT Press, Cambridge.
Sun, B.-Y., Li, J., Wu, D. D., Zhang, X.-M., and Li, W.-B.
(2010). Kernel discriminant learning for ordinal re-
gression. IEEE Transactions on Knowledge and Data
Engineering, 22:906–910.
Tenenbaum, J. B., De Silva, V., and Langford, J. C. (2000).
A global geometric framework for nonlinear dimen-
sionality reduction. Science, 290(5500):2319–2323.
Vapnik, V. and Vashist, A. (2009). A new learning
paradigm: Learning using privileged information.
Neural Networks, 22(5-6):544–557.