to achieve better recognition quality and less than
half the training time when we used the Hamming
metric and the proposed metric, compared to the ex-
isting algorithm. The proposed metric again performed
slightly better.
Table 2: Results for a dataset with time and amplitude dis-
tortions up to 30%, test trajectory length of 3000 points and
2 abnormal behavior classes.

                     e_I   e_II   Training time
Existing algorithm   18    0%     2 h. 33 min.
Trivial metric       29    0%     1 h. 33 min.
Jaccard metric       52    0%     1 h. 18 min.
Hamming metric        6    0%     54 min.
Proposed metric       1    0%     53 min.
Overall, the results show that recognizers con-
structed with the proposed algorithm that use either
the Hamming metric or the proposed metric
achieve better recognition quality while requiring
less training time, even in the presence of ampli-
tude and time distortions of up to 30%.
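As a point of reference, the Hamming metric compared above counts the positions at which two equal-length sequences differ. A minimal sketch, with purely illustrative marking values (the actual axiom markings used in the experiments are not reproduced here):

```python
def hamming_distance(a, b):
    """Number of positions at which sequences a and b differ."""
    if len(a) != len(b):
        raise ValueError("Hamming distance requires equal-length sequences")
    return sum(x != y for x, y in zip(a, b))

# Hypothetical axiom markings (sequences of axiom indices).
marking_ref = [0, 1, 1, 2, 3, 3, 0]
marking_obs = [0, 1, 2, 2, 3, 0, 0]
print(hamming_distance(marking_ref, marking_obs))  # -> 2
```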
8 CONCLUSIONS
This paper considers the problem of automatically
constructing algorithms that recognize segments of ab-
normal behavior in multidimensional phase trajecto-
ries of dynamic systems. The recognizers are con-
structed from a set of examples of normal and abnor-
mal behavior of the system. We employ the axiomatic
approach to abnormal behavior recognition to construct
the recognizers. In this paper we propose a modification
of the way a set of axioms is transformed into an axiom
system during recognizer construction; this change in
turn requires modified training and recognition algorithms.
We present the modified genetic recognizer construction
algorithm and a DTW-based search algorithm.
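The paper's DTW-based search algorithm is not reproduced here; as a minimal sketch, the classic dynamic time warping distance that underlies it can be computed as follows (sequence values are illustrative):

```python
def dtw_distance(s, t):
    """Classic DTW distance between 1-D sequences s and t."""
    n, m = len(s), len(t)
    INF = float("inf")
    # D[i][j] holds the minimal warping cost of aligning s[:i] with t[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Warping absorbs the duplicated sample, so the distance is zero.
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # -> 0.0
```

Unlike the Hamming metric, DTW tolerates time distortions by allowing one-to-many alignments between samples, which is why it serves as the basis of the search algorithm.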
Experimental evaluation of the proposed algo-
rithms shows that they decrease the number of errors
by an order of magnitude compared to the previous
training and recognition algorithms, while recognizer
training takes less than half the time required by the
previous algorithms.
REFERENCES
Cover, T. and Hart, P. (1967). Nearest neighbor pattern clas-
sification. Information Theory, IEEE Transactions on,
13(1):21–27.
Hamming, R. W. (1950). Error detecting and error correct-
ing codes. Bell System Tech. J., 29:147–160.
Hassani, H. (2007). Singular spectrum analysis: Method-
ology and comparison. Journal of Data Science,
5(2):239–257.
Haykin, S. (1998). Neural Networks: A Comprehensive
Foundation. Prentice Hall PTR, Upper Saddle River,
NJ, USA, 2nd edition.
Keogh, E. J. and Pazzani, M. J. (2001). Derivative dynamic
time warping. In First SIAM International Conference
on Data Mining (SDM2001).
Kostenko, V. A. and Shcherbinin, V. V. (2013). Training
methods and algorithms for recognition of nonlinearly
distorted phase trajectories of dynamic systems. Opti-
cal Memory and Neural Networks, 22:8–20.
Kovalenko, D. S., Kostenko, V. A., and Vasin, E. A. (2005).
Investigation of applicability of algebraic approach to
analysis of time series. In Proceedings of II Interna-
tional Conference on Methods and Tools for Informa-
tion Processing, pages 553–559. (in Russian).
Kovalenko, D. S., Kostenko, V. A., and Vasin, E. A. (2010).
A genetic algorithm for construction of recognizers
of anomalies in behaviour of dynamical systems. In
Proceedings of 5th IEEE Int. Conf. on Bio Inspired
Computing: Theories and Applications, pages 258–
263. IEEE Press.
Müller, M. (2007). Information Retrieval for Music and
Motion. Springer-Verlag New York, Inc., Secaucus,
NJ, USA.
Rudakov, K. V. and Chekhovich, Y. V. (2003). Algebraic
approach to the problem of synthesis of trainable al-
gorithms for trend revealing. Doklady Mathematics,
67(1):127–130.
Tan, P.-N., Steinbach, M., and Kumar, V. (2005). Introduc-
tion to Data Mining, (First Edition). Addison-Wesley
Longman Publishing Co., Inc., Boston, MA, USA.
Vapnik, V. (1998). Statistical Learning Theory. Wiley-
Interscience.
Vorontsov, K. V. (2004). Combinatorial substantiation of
learning algorithms. Journal of Comp. Maths Math.
Phys, 44(11):1997–2009.
IJCCI 2013 - International Joint Conference on Computational Intelligence