Application of the Device of Measurement of Bioelectric Activity of
Muscles and Nerve Structures for Gesture Recognition
Application of Gesture Recognition on the Example of Action Game
Levanov Alexey Alexendrovich
Department of Information Systems and Telecommunications, MSTU, 2nd Baumanskaya St., Moscow, Russia
Keywords: Gesture Recognition, Games.
Abstract: Using the bioPlux device, we can identify a set of user gestures and, based on them, create a multimodal interface. The gestures are selected so that they are independent of each other. The main goal of the research is to create a simple game that can be controlled by different hand movements. From a practical point of view, the relevance of the topic is determined by the need to create a software system that can provide a sign-language interface in real time.
1 INTRODUCTION
1.1 Relevance of the Topic
Currently, research and development of human-machine interfaces based on pattern recognition and the visualization of multimedia information is at the leading edge of modern software development. The developers of these interfaces are interested in natural ways for people to communicate with computers using gestures, voice and other modalities. Gestures are particularly promising for building interfaces that control software, computer hardware and robots, and for extending interfaces to people with hearing problems and speech disorders. From a practical point of view, the relevance of the topic is determined by the need to create a software system that can provide a sign-language interface in real time. The advantage of such a system over vision-based pattern recognition systems is its higher accuracy.
1.2 The Purpose and Objectives of the
Research
The aim of the research is to develop a general methodology for capturing, tracking and recognizing arm-movement gestures in real time with high reliability, in order to create a human-machine interface based on the bioPlux device.
To achieve this goal, the following objectives were set:
- Carry out a comparative analytical review of existing methods for capturing, tracking and recognizing dynamic human gestures.
- Develop a computationally efficient algorithm for capturing, tracking and recognizing gestures using the bioPlux device.
- Create a real-time system that uses these algorithms to dynamically change its state depending on the user's gesture.
- Conduct experiments to assess the reliability and efficiency of the system in real time, confirming the theoretical results.
1.3 Description of the Device
The bioPlux device collects and digitizes the signals from the sensors and transmits them to the computer via Bluetooth, where they can be processed. Its channels have 12-bit resolution and a sampling rate of 1000 Hz. The device also has a digital port, a terminal for connecting the AC adapter and charging the internal battery (giving about 12 hours of autonomous operation), and a channel for connecting a ground electrode, which is necessary for proper monitoring of electromyographic signals.
Figure 1: The appearance and characteristics of the device.
2 GESTURE MODALITIES
Scientific interest in the verbal (speech) and nonverbal (gestures, facial expressions, touches, etc.) behavior of people during communication arose only in the twentieth century. For a long time the theory of verbal and nonverbal communication was developed at an intuitive level. Serious scientific investigation of verbal and nonverbal communication began in the 1920s–1930s within the framework of journalism theory. Psychologists established that the share of information transferred by nonverbal signals during human interaction ranges from 60% to 80% (Ekman & Friesen, 1969).
Moreover, most researchers hold the opinion that the verbal channel is used mainly for transferring factual information, while the nonverbal channel is a means of conveying interpersonal relations and only in rare cases is used instead of verbal messages. This testifies to the important role of the nonverbal information conveyed by gestures and facial expressions in analyzing people's behavior and in developing human-machine interfaces for computer games. For the most part, scientific work deals with gestures performed by the hands.
Generally, a gesture is a sign unit carried out by parts of the human body, consciously or unconsciously, for the purpose of communication. In order to decode the information carried by gestures, a classification of them must be defined. Gestures are subdivided into natural and artificial ones. Natural gestures are inherent in a person by nature or have been produced by humanity during its evolution. Gesture classifiers describe the images and meanings of gestures so that they can be used with a high degree of adequacy. It should be noted that all interpretations depend on knowledge of the environment in which the gesture is made or of the context accompanying the gesture, and many cultures interpret the same gesture in completely different ways. Moreover, the frequency of gesticulation (the number of gestures made per unit of time) in Western Europe is higher than in Russia, but the gestures of Western Europeans occupy less space than those of Russians, as Western Europeans gesticulate with their elbows pressed to the body. Western European gestures do not intrude at all into the personal space of the interlocutor. In contrast to the Russian tradition, in Western Europe symmetric gestures prevail, a handshake is shorter than in Russia, and gestures are made with a half-bent arm rather than the outstretched arm, as in Russia.
3 RECEIPT AND PROCESSING
OF SIGNALS
Using a special API, we have created software that receives real-time data from the sensors attached to the user's hands and, after processing and recognition, changes the state of the application.
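The sketch below illustrates the overall acquisition loop in Python. The paper does not show the actual API, so `Device.read_frame`, `recognizer.classify`, and `app.set_state` are hypothetical placeholders standing in for the vendor SDK and our application objects; the window length is likewise an assumption.

```python
# A minimal sketch of the acquisition loop. The `device`, `recognizer`
# and `app` objects are hypothetical placeholders, not the real SDK.

from collections import deque

WINDOW = 100  # samples (~0.1 s at 1000 Hz); an assumed window size

def run(device, recognizer, app):
    """Read frames, recognize the gesture, update application state."""
    window = deque(maxlen=WINDOW)
    while app.running:
        frame = device.read_frame()      # one sample per channel
        window.append(frame)
        if len(window) == WINDOW:
            gesture = recognizer.classify(window)
            app.set_state(gesture)       # e.g. 'normal', 'fist', 'rotated'
```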
To select suitable gestures for recognition, we carried out a series of experiments. We identified five independent, unconnected gestures associated with the work of different muscles or with a superposition of their work. After that, we chose the two gestures that are used in our gaming application.
These gestures are:
Normal state of the hand:
Figure 2: Normal state of the hand.
BIODEVICES2013-InternationalConferenceonBiomedicalElectronicsandDevices
302
Hand in a fist:
Figure 3: Hand in a fist.
But how did we analyze all the data coming from the device? A sampling rate of 1000 Hz means 1000 values per second, which is a great deal of information. First of all, we made thousands of measurements and saved all the values to separate files. Then we analyzed each file to find its maximum and minimum values. Finally, we identified those gestures that are independent of each other on the same channels.
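The following Python sketch mirrors this offline analysis under an assumed file layout (one whitespace-separated row of channel values per sample): it computes the per-channel minimum and maximum for a recording and tests whether two gestures are separable, that is, whether their value ranges fail to overlap on at least one channel.

```python
# A sketch of the offline analysis described above. The file layout is
# an assumption: one row of channel values per sample.

def channel_ranges(path, n_channels):
    """Per-channel (min, max) over one recorded gesture file."""
    lows = [float('inf')] * n_channels
    highs = [float('-inf')] * n_channels
    with open(path) as f:
        for line in f:
            for ch, v in enumerate(map(float, line.split())):
                lows[ch] = min(lows[ch], v)
                highs[ch] = max(highs[ch], v)
    return list(zip(lows, highs))

def independent(ranges_a, ranges_b):
    """Two gestures are separable if some channel's ranges don't overlap."""
    return any(hi_a < lo_b or hi_b < lo_a
               for (lo_a, hi_a), (lo_b, hi_b) in zip(ranges_a, ranges_b))
```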
How does our sample application work? There is a ball on the screen. When the user rotates his or her hand, the ball starts moving. When the user clenches his or her fist, the ball starts jumping.
Figure 4: Hand in a fist – ball is jumping.
Figure 5: Rotated hand – ball is rolling.
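The game logic itself reduces to a small state update per frame. The sketch below is an illustrative Python reconstruction of the mapping described above, not the actual game code: the gesture labels, velocities, and gravity constant are assumptions.

```python
# Illustrative game-state mapping: rotated hand -> ball rolls,
# clenched fist -> ball jumps. All constants are assumed values.

class Ball:
    def __init__(self):
        self.x = 0.0      # horizontal position
        self.y = 0.0      # height above the ground
        self.vy = 0.0     # vertical velocity

    def update(self, gesture, dt=0.016):
        if gesture == 'rotated':            # rotated hand: ball rolls
            self.x += 100.0 * dt
        elif gesture == 'fist' and self.y == 0.0:
            self.vy = 250.0                 # clenched fist: ball jumps
        # simple gravity integration
        self.y += self.vy * dt
        self.vy -= 600.0 * dt
        if self.y <= 0.0:                   # ball has landed
            self.y, self.vy = 0.0, 0.0
```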
4 THE RESULTS OF THE
SYSTEM
This section presents data on the accuracy and reliability of the developed system. To test the system, we conducted a series of experiments in which the user controlled the movement of the ball with the special gestures. We evaluated the accuracy and reliability of the gesture recognition. Data illustrating these experiments are shown below:
Table 1: Accuracy of the system.

Data                      Total number of tests   False measurements   Successful recognition
Data for the right hand   500                     0                    100%
Data for the left hand    500                     2                    99.6%
Table 2: Reliability of the system.

Data                      Total number of tests   System failures   Stable operation
Data for the right hand   500                     3                 99.4%
Data for the left hand    500                     1                 99.8%
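As a check on the tables, the percentages follow directly from the counts: the success rate is 100 × (total − errors) / total. A minimal Python illustration:

```python
def success_rate(total, errors):
    """Percentage of trials without errors, as reported in the tables."""
    return 100.0 * (total - errors) / total

print(success_rate(500, 0))  # 100.0 -> Table 1, right hand
print(success_rate(500, 2))  # 99.6  -> Table 1, left hand
print(success_rate(500, 3))  # 99.4  -> Table 2, right hand
print(success_rate(500, 1))  # 99.8  -> Table 2, left hand
```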
5 CONCLUSIONS
The problem of dynamic gesture recognition for the human right and left hands has been solved. Experimental evidence of the high accuracy, stability and speed of the system has been presented. In the future we plan to add further gestures for recognition and more advanced graphics, to create a full biofeedback system.
REFERENCES
Alon, J., Athitsos, V., Yuan, Q., and Sclaroff, S. (2005).
Simultaneous Localization and Recognition of
Dynamic Hand Gestures. Proc. of WACV
MOTION’05, 2, 254-260.
Avilés-Arriaga, H. H., Sucar, L. E., and Mendoza, C. E. (2003). Visual Recognition of Gestures Using Dynamic Naive Bayesian Classifiers. Proc. of IEEE International Workshop on Robot and Human Interactive Communication, Millbrae, 133-138.
Bobick, A. F., & Wilson, A. D. (1997). A State-Based
Approach to the Representation and Recognition of
Gesture. Proc. of IEEE Transactions on pattern
analysis and machine intelligence, 19(12), 1325–1337.
Brown, L., G. (1992). A Survey of Image Registration
Techniques. Computing Surveys, 24(4), 325-376.
Chambers, G. S., Venkatesh, S., West, G. A. W., and Bui, H. H. (2004). Umpire Gesture Recognition. Structural, Syntactic, and Statistical Pattern Recognition, Vol. 3138, 859-867.
Cutler, R., & Turk, M. (1998). View-based Interpretation
of Real-time Optical Flow for Gesture Recognition.
Proc. of Third IEEE Intern. Conf. on Autom. Face and
Gesture Recog. Nara, 416-421.
Darwiche, A. (2003). A Differential Approach to Inference in Bayesian Networks. Journal of the ACM, 50(3), 280-305.
Davis, J.W., & Shah, M. (1992). Gesture Recognition.
Proc. of European Conf. Comp. Vis., 331–340.
Devyatkov, V. & Alfimtsev, A. (2008). Optimal Fuzzy
Aggregation of Secondary Attributes in Recognition
Problems. Proc. of 16-th International Conference in
Central Europe on Computer Graphics. Visualization
and Computer Vision. Plzen, 78–85.
Devyatkov, V., & Alfimtsev, A. (2009). Dynamic Gesture Recognition Using Fuzzy Model. Proc. of the 13th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2009), Orlando, USA, 145-150.
Ekman, P., & Friesen, W. (1969). The Repertoire of
Nonverbal Behavior: Categories, Origins, Usage and
Coding. Semiotica, 1, 49–98.
Freeman, W. T., Tanaka, K., Ohta, J., and Kyuma, K. (1996).
Computer Vision for Computer Games. In Proc. IEEE
Int. Conf. on Face & Gesture Recognition, 100–105.
Frei, W., & Chen, C. C. (1977). Fast Boundary Detection:
A Generalization and New Approach. IEEE Trans.
Comput., 26(10), 988-998.
Garcia, C., & Tziritas, G. (1999). Face Detection Using
Quantized Skin Color Regions Merging and Wavelet
Packet Analysis. IEEE Transactions on multimedia,
1(3), 264-277.
Gould, K., & Shah, M. (1989). The Trajectory Primal
Sketch: A MultiScale Scheme for Representing
Motion Characteristics. Proc. of Comp. Vis. and
Pattern Rec, 79–85.
Graetzel, C., Fong, T. W., Grange, S., and Baur, C. (2004). A Non-Contact Mouse for Surgeon-Computer Interaction. J. Tech. and Health Care, 12(3), 245-257.
Horn, B. K. P., & Schunck, B. G. (1981). Determining Optical Flow. Artificial Intelligence, 17, 185-203.
Johansson, G. (1964) Perception of Motion and Changing
Form. Scandinavian J. Psychology, 5, 181-208.
Kang, H., Lee, C. W., Jung, K. (2004) Recognition-Based
Gesture Spotting in Video Games. Pattern Rec. Let.,
V. 25, I. 15, 1701-1714.
Keir, P., Elgoyhen, J., Naef, M., Payne, J., Horner, M., and Anderson, P. (2006). Gesture-Recognition with Non-Referenced Tracking. 1st IEEE Symposium on 3D User Interfaces, 137.
Kirsch, R. (1977). Computer Determination of the
Constituent Structure of Biological Images. Comput.
Biomed, 4(3), 315-328.
Kim, N., An, Y., and Cha, B. (2009). Gesture Recognition
Based on Neural Networks for Dance Game Contents.
International Conference on New Trends in
Information and Service Science, 134-1139.
Kwak, K., & Pedrycz, W. (2005). Face Recognition: A
Study in Information Fusion Using Fuzzy Integral.
Patt. Recog. Lett, 26, 719-733.
Kyle, J. & Woll B. (1988) Sign Language: The Study of
Deaf People and Their Language. Cambridge
University Press, 328 p.
Lienhart, R., & Maydt J. (2002). An Extended Set of
Haar-like Features for Rapid Object Detection. IEEE
ICIP, 1, 900-903.
Lienhart, R., Kuranov, A., Pisarevsky, V. (2003).
Empirical Analysis of Detection Cascades of Boosted
Classifiers for Rapid Object Detection. Proc. of
DAGM03, 297-304.
Liu, Z. (2001). Dynamic Image Sequence Analysis Using
Fuzzy Measures. IEEE trans. on sys., man, and
cybern, 31(4), 557-572.
Mamiya, H., Sato, T., Fukuchi, K., Koike, H. (2007) A
Tabletop Entertainment System and Finger Tapping
Gesture Recognition. In Proceedings of WISS, JSSST,
53–58.
Miklós, I., & Meyer, I. (2005) A Linear Memory
Approach for Baum-Welch Training. BMC
Bioinformatics, 6(231), 1471-2105.
Nishikawa, A., Hosoi, T., Koara, K., Negoro, D., Hikita,
A., Asano, S., Kakutani, H., Miyazaki, F., Sekimoto,
M., Yasui, M., Miyake, Y., Takiguchi, S., and
Monden, M. (2003). FAce MOUSe: A Novel Human-
Machine Interface for Controlling the Position of a
Laparoscope. IEEE Trans. on Robotics and Automation, 19(5), 825-841.
Ong, S., & Ranganath, S. (2005). Automatic Sign
Language Analysis: A Survey and the Future beyond
Lexical Meaning. IEEE Transactions on Pattern
Analysis and Machine Intelligence, 5(6), 873-891.
Park, J. Y., & Yi, J. H. (2008). Gesture Recognition Based Interactive Boxing Game. Scientific Literature Digital Library: http://www.icis.ntu.edu.sg/scs-ijit/1207/ijit-1207_05.pdf
Patel, S. (1995). A lower-complexity Viterbi approach.
Acoustics, Speech, and Signal Processing, 1, 592-595.
Rabiner, L., & Juang, B.H. (1993). Fundamentals of
Speech Recognition. Prentice Hall.
Rett, J., & Dias, J. (2006). Gesture Recognition Using a
Marionette Model and Dynamic Bayesian Networks.
Lecture notes in computer science, 4142, 69-80.
Rigoll, G., Kosmala, A., and Eickeler, S. (1997). High
Performance Real-Time Gesture Recognition Using
Hidden Markov Models. Proc. of the Internat. Gesture
Workshop on Gesture and Sign Lang. in Human-
Computer Interac, 69-80.
Russell, S. J., & Norvig, P. (2002). Artificial Intelligence: A Modern Approach. Upper Saddle River, New Jersey, Prentice Hall.
Sandberg, A. (1997). Gesture Recognition using Neural
Networks. Master thesis. Stockholm.
Schultz, M., Gill, J., Zubairi, S., Huber, R., and Gordin, F. (2003). Bacterial Contamination of Computer Keyboards in a Teaching Hospital. Infect Control Hosp Epidemiol, 24, 302-303.
Schumeyer, R. P., & Barner, K. E. (1998). A Color-Based
Classifier for Region Identification in Video. SPIE
Visual Communications Image Processing, 3309, 189-
200.
Shapiro, L.G., & Stockman, G.S. (2001). Computer
Vision. Upper Saddle River, N.J., Prentice-Hall.
Sharma, R. (2003). Speech-Gesture Driven Multimodal
Interfaces for Crisis Management. Proc. of the IEEE,
91, 1327–1354.
Sigal, L., & Sclaroff, S. (2004). Skin Color-Based Video
Segmentation under Time-Varying Illumination. IEEE
Transactions on pattern analysis and machine
intelligence, 26(7), 862-877.
Silva, M., Courboulay, V., Prigent, A., Estraillier, P.
(2008). Real-Time Face Tracking for Attention Aware
Adaptive Games. ICVS 2008, 99–108.
Starner, T., Weaver, J., and Pentland, A. (1998). Real-
Time American Sign Language Recognition Using
Desk and Wearable Computer Based Video. IEEE
Trans. Pattern Analysis and Machine Intelligence,
20(12), 1371–1375.
Song, P., Yu, H., Winkler, S. (2009). Vision-based 3D
Finger Interactions for Mixed Reality Games with
Physics Simulation. The International Journal of
Virtual Reality, 8(2):1-6.
Su, J., & Zhang, H. (2005). Full Bayesian Network
Classifiers. Proc. of the 23rd international conference
on Machine learning, 897 - 904.
Tahani, H., & Keller, J. M. (1990). Information Fusion in
Computer Vision Using the Fuzzy Integral. IEEE
transactions on systems, man, and cybernetics, 20(3),
733-741.
Tomasi, C., Petrov, S., and Sastry, A. (2003). 3D Tracking
= Classification + Interpolation. Proc. of Int. Conf.
Computer Vision, 1441–1448.
Viola, P., & Jones, M. (2001) Rapid Object Detection
using a Boosted Cascade of Simple Features. IEEE
CVPR, 1, 511-518.
Winkler, S., Yu, H., Zhou, Z.Y. (2007). Tangible Mixed
Reality Desktop for Digital Media Management. In
SPIE Engineering Reality of Virtual Reality, Vol.
6490B.
Winston, P. H. (1992). Artificial Intelligence. Reading, Massachusetts, Addison-Wesley Publishing Company.
Wong, S.F., & Cipolla, R. (2006). Continuous Gesture
Recognition using a Sparse Bayesian Classifier. Proc
of 18th Internat. Conf. on Pattern Recognition, 1084-
1087.
Wu, H., Chen, Q., and Yachida, M. (1999). Face Detection
From Color Images Using a Fuzzy Pattern Matching
Method. IEEE Transactions on pattern analysis and
machine intelligence, 21(6), 557-563.
Yamato, J., Ohya, J., and Ishii, K. (1992). Recognizing
Human Action in Time-Sequential Images Using
Hidden Markov Model. Proc. of Comp. Vis. and
Pattern Rec, 379–385.
Yanagihara, Y., & Hiromitsu, H. (2000). System for
Selecting and Generating Images Controlled by Eye
Movements Applicable to CT Image Display. Medical
Imaging Technology, 18(5), 725-733.
Yang, J., & Waibel, A. (1996) A Real-Time Face Tracker.
Proc. of the Third IEEE Workshop on Applicat. of
Comp. Vision, 142-147.
Ye, G., Corso, J., Hager, G., (2004). Gesture Recognition
Using 3D Appearance and Motion Features. Proc. of
Workshop on Real-time Vision for Human-Computer
Interaction, 160-161.
Zeng, T. J., Wang, Y., Freedman, M. T., and Mun, S. K. (1997). Finger Tracking for Breast Palpation Quantification Using Color Image Features. SPIE Optical Eng., 36(12), 3455-3461.
ApplicationoftheDeviceofMeasurementofBioelectricActivityofMusclesandNerveStructuresforGestureRecognition
-ApplicationofGestureRecognitionontheExampleofActionGame
305