Table 2: Average accuracy for emotion recognition.
Database    SMO Baseline    SMO IGR      SMO OGA      SMO PS+MCGP
Berlin      12.99           16.9 (35)    15.1 (26)    25.1 (23)
UUDB        86.04           85.5 (40)    83.3 (30)    87.7 (25)
LEGO        13.46           20.3 (19)    20.7 (15)    45.2 (16)
RSDB        32.38           23.8 (15)    34.4 (12)    80.9 (10)
Table 3: Average accuracy for gender identification.
Database    SMO Baseline    SMO IGR      SMO OGA      SMO PS+MCGP
Berlin      24.30           99.1 (38)    21.3 (25)    92.5 (24)
UUDB        34.77           65.4 (39)    31.8 (28)    87.1 (26)
LEGO        36.89           61.2 (17)    64.6 (18)    70.3 (15)
RSDB        53.89           73.6 (16)    52.8 (15)    94.1 (12)
The optimization technique with 10 steps was applied, i.e. for the Berlin database: the first 5, 10, 20, ..., 45 features. The data sets were randomly divided into training and test samples in a proportion of 80% to 20%. In all experiments, the PS+MCGP algorithms were provided with an equal amount of resources. The final solution is the relevant feature set, which is determined by the number of input neurons of the ANN in the second part of the described algorithm. Tables 2 and 3 contain the relative classification accuracy for the described corpora; the average number of selected features is given in parentheses. The columns entitled SMO Baseline contain the results achieved with the baseline feature selection method. Similarly, the columns titled SMO IGR, SMO OGA and SMO PS+MCGP contain the results obtained with the IGR, OGA and PS+MCGP feature selection procedures, respectively.
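To make the evaluation protocol concrete, the following is a minimal sketch of the stepwise sweep with a random 80/20 split (an illustration only, assuming Python with scikit-learn; SVC is used here as a stand-in for the SMO classifier, and select_features is a hypothetical placeholder for any of the compared feature selection procedures, e.g. IGR, OGA or PS+MCGP):

from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def evaluate(features, labels, select_features, step=5, n_steps=10, seed=0):
    # features: NumPy array of acoustic features; labels: class labels.
    # Random 80/20 split into training and test samples, as in the experiments.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=seed)
    results = {}
    # Stepwise sweep over candidate feature-subset sizes (step, 2*step, ...).
    for k in range(step, step * n_steps + 1, step):
        idx = select_features(X_train, y_train, k)  # indices of the k selected features
        clf = SVC().fit(X_train[:, idx], y_train)
        results[k] = accuracy_score(y_test, clf.predict(X_test[:, idx]))
    return results

This sketch mirrors only the evaluation loop; the actual PS+MCGP procedure additionally evolves the ANN classifier whose number of input neurons defines the final feature set, as described above.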
6 CONCLUSIONS AND FUTURE WORK
Applying the proposed hybrid algorithm to select the relevant features and maximize the accuracy of particular tasks can decrease the number of features and increase the accuracy of the system simultaneously. In most cases, the PS+MCGP approach outperforms the other algorithms. Moreover, the MCGP approach is able to create (select) the optimal variant of the ANN classifier, which can be applied to improve the effectiveness of speaker state recognition. It should be noted that the number of features selected by the IGR and OGA methods is quite high: in some cases the number of features was equal to 41, i.e. the optimal model was effectively built without any feature selection at all.
The use of more effective classifiers may improve the performance of the proposed approach. It is therefore important to estimate the efficiency of classifiers in combination with feature selection algorithms for a comprehensive improvement of recognition systems.
Additionally, the proposed approach can be applied to improve the effectiveness of "real-time" (on-line) systems. On-line processing is always accompanied by various effects, such as noise and voice distortion, that have to be handled in real time. Therefore, it can be useful to incorporate the developed algorithm into on-line systems. For example, the proposed approach can be applied to improve real-time recognition of a person's psycho-emotional state. We assume that a study in this direction will make state recognition procedures for real-time systems more accurate.