Sparse Least Squares Twin Support Vector Machines with
Manifold-preserving Graph Reduction
Xijiong Xie
The School of Information Science and Engineering,
Ningbo University, Zhejiang 315211, China
Keywords:
Non-parallel Hyperplane Classifier, Least Squares Twin Support Vector Machines, Manifold-preserving
Graph Reduction.
Abstract:
Least squares twin support vector machines are a non-parallel hyperplane classifier in which the primal optimization problems of twin support vector machines are modified in the least squares sense and the inequality constraints are replaced by equality constraints. In classification problems, it is important both to enhance the robustness of least squares twin support vector machines and to reduce the time complexity of the kernel function evaluations required to infer the label of a new example. In this paper, we propose a new sparse least squares twin support vector machine based on manifold-preserving graph reduction, an efficient graph reduction algorithm built on the manifold assumption. The method first selects informative examples from the positive and negative classes, respectively, and then uses them for classification. Experimental results confirm the feasibility and effectiveness of the proposed method.
1 INTRODUCTION
Support vector machines (SVMs) are a very efficient classification algorithm (Shawe-Taylor and Sun, 2011; Vapnik, 1995; Christianini and Shawe-Taylor, 2002; Ripley, 2002) based on the principled idea of structural risk minimization in statistical learning theory. Compared with other machine learning algorithms, SVMs often obtain better generalization. They are well known for their robustness, good generalization ability, and unique globally optimal solution for convex problems. Recent years have witnessed the emergence of many successful non-parallel hyperplane classifiers. Twin support vector machines (TSVM) (Jayadeva et al., 2007) are a representative non-parallel hyperplane classifier that generates two non-parallel hyperplanes such that each hyperplane is close to one class and as far as possible from the other. Twin bounded SVM (TBSVM) (Shao et al., 2011) is an improved version of TSVM whose optimization problems are modified slightly by adding a regularization term that realizes the idea of margin maximization. TSVM has been extended to learning frameworks such as multi-task learning (Xie and Sun, 2015b), multi-view learning (Xie and Sun, 2015a; Xie and Sun, 2014), semi-supervised learning (Chen et al., 2016), multi-label learning (Qi et al., 2012) and regression (Peng, 2010).
The two non-parallel hyperplanes of TSVM are obtained by solving a pair of quadratic programming problems (QPPs), so its time complexity is relatively high. Least squares twin support vector machines (LSTSVM) (Kumar and Gopal, 2009) were proposed to reduce this complexity: the inequality constraints are changed to equality constraints, which turns the pair of QPPs into a pair of systems of linear equations, so LSTSVM can easily handle large datasets. Many improved variants of LSTSVM have been proposed, such as knowledge-based LSTSVM (Kumar et al., 2010), Laplacian LSTSVM for semi-supervised classification (Chen et al., 2014), and weighted LSTSVM (Mu et al., 2014).
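To make the contrast with TSVM concrete, the linear LSTSVM solution can be sketched in a few lines of NumPy. This is an illustrative sketch following the closed-form solution of Kumar and Gopal (2009); the function names and the default parameter setting c1 = c2 = 1 are our own choices for illustration.

```python
import numpy as np

def lstsvm_train(A, B, c1=1.0, c2=1.0):
    """Linear LSTSVM: solve two linear systems instead of two QPPs.

    A: positive-class examples (m1 x n); B: negative-class examples (m2 x n).
    Returns the two non-parallel hyperplanes (w1, b1) and (w2, b2).
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])  # augmented positive-class matrix [A  e1]
    F = np.hstack([B, e2])  # augmented negative-class matrix [B  e2]
    # Hyperplane 1 (close to class A): z1 = -(F'F + (1/c1) E'E)^{-1} F' e2
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * (E.T @ E), F.T @ e2)
    # Hyperplane 2 (close to class B): z2 =  (E'E + (1/c2) F'F)^{-1} E' e1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * (F.T @ F), E.T @ e1)
    return z1[:-1], z1[-1], z2[:-1], z2[-1]

def lstsvm_predict(X, w1, b1, w2, b2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1.ravel() <= d2.ravel(), 1, -1)
```

At test time the linear classifier needs only two inner products per example; in the kernel case, one kernel evaluation per retained training example is required, which is precisely what motivates keeping only a sparse, informative subset of the data.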
However, two issues remain important: enhancing the robustness of LSTSVM, and reducing the number of kernel function evaluations required when inferring the label of a new example.
One class of sparse methods uses only a subset of the data and focuses on strategies for selecting representative examples to form that subset. These methods lead to a significant reduction in time complexity. Although methods such as random sampling or k-means clustering can be used to reduce the size of the graph, they offer no guarantee of preserving the manifold structure or of effectively removing outliers and noisy examples. In
Xie, X.
Sparse Least Squares Twin Support Vector Machines with Manifold-preserving Graph Reduction.
DOI: 10.5220/0006690805630567
In Proceedings of the 7th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2018), pages 563-567
ISBN: 978-989-758-276-9
Copyright © 2018 by SCITEPRESS – Science and Technology Publications, Lda. All rights reserved
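To make the graph-reduction idea concrete, the following is a minimal, assumption-laden sketch, not the exact MPGR procedure of the cited work: it builds a k-nearest-neighbor graph with Gaussian edge weights and greedily keeps the vertices best connected to the not-yet-selected part of the graph, so that isolated outliers, which have near-zero degree, are naturally discarded. All function names and parameter choices here are illustrative.

```python
import numpy as np

def knn_graph(X, k=5):
    """Symmetric k-NN adjacency matrix with Gaussian edge weights."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    sigma2 = np.median(d2) + 1e-12                       # heuristic bandwidth
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    # keep only the k strongest edges per vertex, then symmetrize
    drop = np.argsort(W, axis=1)[:, :-k]
    mask = np.ones_like(W, dtype=bool)
    np.put_along_axis(mask, drop, False, axis=1)
    return np.where(mask | mask.T, W, 0.0)

def graph_reduction(X, m, k=5):
    """Greedily select the m vertices with the largest connectivity to the
    still-unselected subgraph; low-degree vertices (outliers, noise) are
    picked last and hence dropped when m < n."""
    W = knn_graph(X, k)
    remaining = list(range(len(X)))
    selected = []
    for _ in range(m):
        deg = W[np.ix_(remaining, remaining)].sum(axis=1)
        j = remaining[int(np.argmax(deg))]
        selected.append(j)
        remaining.remove(j)
    return np.array(selected)
```

In the sparse LSTSVM setting, such a reduction would be run separately on the positive and the negative examples, and only the selected subsets would enter the two linear systems, shrinking both training cost and the number of test-time kernel evaluations.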