because some variants balance the data intrinsically or keep the current class ratio. Hence, it should always be considered, especially since it makes the algorithms robust against long periods in which only one class occurs. Last but not least, we observed that it is always important to look at the interaction between the selection strategies.
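As an illustration of the balancing idea, a removal strategy can preserve the class ratio by always discarding a sample from the currently over-represented class. The helper below is a hypothetical sketch, not the paper's actual strategy:

```python
from collections import Counter

def balanced_removal_index(labels):
    """Pick the index of the oldest stored sample belonging to the
    over-represented class, so that removing it moves the stored
    training set toward a balanced class ratio.

    `labels` is the list of class labels (+1/-1) of the stored samples,
    ordered from oldest to newest (hypothetical helper)."""
    counts = Counter(labels)
    majority = max(counts, key=counts.get)
    # Return the oldest sample of the majority class.
    return next(i for i, y in enumerate(labels) if y == majority)
```

Removing the *oldest* majority-class sample additionally biases the stored set toward recent data, which also helps under drift.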
In future work, we want to analyze different hybrid approaches between the selection strategies from the SVM and the PA. Some inclusion strategies can be applied to the PA, and when samples are removed from the training set, their weights could be kept integrated in the linear classification vector. Additionally, we will compare the most promising approaches on different data, such as movement prediction with EEG or EMG, and on different transfer setups, which will come with other kinds of data shifts. Last but not least, different implementation strategies for efficient updates and different strategies for unsupervised online learning could be compared. In the latter case, the relabeling criterion is expected to be much more beneficial than in our evaluation.
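The idea of keeping the weights of removed samples in the linear classification vector can be sketched for a budgeted Passive-Aggressive learner. This is a minimal illustration, not the paper's implementation; it assumes a plain PA-I style update without an aggressiveness cap:

```python
import numpy as np

class BudgetedPA:
    """Online Passive-Aggressive classifier with a fixed sample budget.

    When the budget is exceeded, the oldest sample is dropped from the
    stored set, but its accumulated contribution tau * y * x is left
    inside the linear weight vector w (the retention strategy discussed
    above). Subtracting the contribution instead would implement full
    removal. Hypothetical sketch, not the authors' code.
    """

    def __init__(self, dim, budget=100):
        self.w = np.zeros(dim)
        self.budget = budget
        self.samples = []  # stored (tau, y, x) contributions

    def update(self, x, y):
        # Hinge loss of the current prediction on (x, y).
        loss = max(0.0, 1.0 - y * self.w.dot(x))
        if loss > 0.0:
            tau = loss / x.dot(x)   # PA step size (no aggressiveness cap)
            self.w += tau * y * x
            self.samples.append((tau, y, x))
            if len(self.samples) > self.budget:
                # Forget the oldest sample but keep its weight in w.
                self.samples.pop(0)

    def predict(self, x):
        return 1 if self.w.dot(x) >= 0 else -1
```

Keeping the pruned weights makes removal free at prediction time, since only the stored sample list shrinks while `w` is untouched.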
ACKNOWLEDGEMENTS
This work was supported by the Federal Min-
istry of Education and Research (BMBF, grant no.
01IM14006A).
We thank Marc Tabie and our anonymous review-
ers for giving useful hints to improve the paper.
NEUROTECHNIX 2015 - International Congress on Neurotechnology, Electronics and Informatics