Authors:
Soumaya Nheri 1; Riadh Ksantini 1,2; Mohamed-Bécha Kaâniche 1 and Adel Bouhoula 1
Affiliations:
1 Higher School of Communication of Tunis, Research Lab: Digital Security, University of Carthage, Carthage, Tunisia
2 University of Windsor, 401 Sunset Avenue, Windsor, ON, Canada
Keyword(s):
Support Vector Machine, Kernel Covariance Matrix, One-Class Classification, Outlier Detection, Low Variances, Subclass Information.
Abstract:
To properly handle spherically distributed data, we intend to exploit subclass information. In one-class classification, many recently proposed methods incorporate subclass information into the standard optimization problem. We argue that, once subclass information is available, the within-class variance should be minimized rather than the global variance. The Covariance-guided One-Class Support Vector Machine (COSVM) emphasizes the low-variance directions of the training dataset, which results in higher accuracy. However, COSVM does not handle multi-modal target class data; more precisely, it does not take advantage of the target class's subclass information. Therefore, to reduce the dispersion of the target data with respect to the newly obtained subclass information, we express the within-class dispersion and incorporate it into the optimization problem of COSVM. We thus introduce a novel variant of the COSVM classifier, namely Dispersion COSVM, that exploits subclass information in the kernel space in order to jointly minimize the dispersion within and between subclasses and improve classification performance. A comparison of our method to contemporary one-class classifiers on numerous real data sets clearly demonstrates its superiority in terms of classification performance.
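
The abstract does not spell out the optimization problem, but the central quantity it describes, the dispersion of the target class measured per subclass in a kernel-induced feature space, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: the use of k-means for subclass extraction, the RBF kernel, the subclass count n_subclasses, and the function name within_subclass_dispersion are all assumptions made for illustration only.

    # Illustrative sketch (not the paper's method): split the target class
    # into subclasses, then sum each subclass's variance in kernel space.
    # k-means, the RBF kernel, and n_subclasses are assumed for this example.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics.pairwise import rbf_kernel

    def within_subclass_dispersion(X, n_subclasses=3, gamma=0.5):
        """Sum of kernel-space variances over the subclasses of X.

        For a subclass S with kernel matrix K_S and centering matrix
        H = I - (1/|S|) 11^T, trace(H K_S H) equals the total squared
        distance of S's points to their mean in feature space.
        """
        labels = KMeans(n_clusters=n_subclasses, n_init=10).fit_predict(X)
        dispersion = 0.0
        for c in range(n_subclasses):
            S = X[labels == c]
            n = len(S)
            K = rbf_kernel(S, S, gamma=gamma)
            H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
            dispersion += np.trace(H @ K @ H) / n  # average variance of S
        return dispersion

    # Example: a two-mode target class, as in the multi-modal setting above
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
    print(within_subclass_dispersion(X, n_subclasses=2))

In the paper's approach, a dispersion term of this kind is incorporated into the COSVM optimization problem itself rather than computed separately; the sketch only shows why splitting a multi-modal target class into subclasses yields a tighter dispersion measure than a single global variance.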