Authors:
Frédéric Li 1; Lukas Köping 1; Sebastian Schmitz 2 and Marcin Grzegorzek 3
Affiliations:
1 Research Group for Pattern Recognition, Germany; 2 Fraunhofer SCAI, Germany; 3 University of Economics in Katowice, Poland
Keyword(s):
Gesture Recognition, Particle Filter, Gesture Spotting, Dynamic Time Warping, DTW Barycenter Averaging.
Related Ontology Subjects/Areas/Topics:
Applications; Bayesian Models; Cardiovascular Imaging and Cardiography; Cardiovascular Technologies; Classification; Computer Vision, Visualization and Computer Graphics; Health Engineering and Technology Applications; Human-Computer Interaction; Methodologies and Methods; Motion and Tracking; Motion, Tracking and Stereo Vision; Pattern Recognition; Physiological Computing Systems; Signal Processing; Software Engineering; Theory and Methods
Abstract:
In this paper, we present an approach for real-time gesture recognition based exclusively on 1D sensor data, using Particle Filters and Dynamic Time Warping Barycenter Averaging (DBA). In a training phase, sensor records of users performing different gestures are acquired. For each gesture, the associated sensor records are processed by the DBA method to produce one average record called the template gesture. Once trained, our system classifies a gesture performed in real time by using particle filters to estimate its probability of belonging to each class, based on the comparison of the sensor values acquired in real time with those of the template gestures. Our method is tested on the accelerometer data of the Multimodal Human Activities Dataset (MHAD) using Leave-One-Out cross-validation, and compared with state-of-the-art approaches (SVM, Neural Networks) adapted for real-time gesture recognition. It achieves an 85.30% average accuracy and outperforms the other approaches, without the need to define hyper-parameters whose choice could be constrained by real-time implementation considerations.
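As a rough illustration of the training step described above, the sketch below builds one template gesture per class by DTW Barycenter Averaging over a set of 1D sensor records. The pure-NumPy implementation, the fixed number of refinement iterations, and the initialisation from the first record are our own assumptions for illustration, not the paper's code.

```python
import numpy as np

def dtw_path(x, y):
    """Compute the DTW alignment path between two 1D sequences."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (x[i - 1] - y[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack from the end of both sequences to recover the alignment.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def dba(sequences, n_iter=10):
    """DTW Barycenter Averaging: iteratively refine an average 1D sequence."""
    template = np.array(sequences[0], dtype=float)  # initial guess (assumption)
    for _ in range(n_iter):
        sums = np.zeros_like(template)
        counts = np.zeros(len(template))
        for seq in sequences:
            # Average all record samples aligned to each template sample.
            for ti, si in dtw_path(template, seq):
                sums[ti] += seq[si]
                counts[ti] += 1
        template = sums / np.maximum(counts, 1)
    return template

# Hypothetical usage: one template per gesture class.
# templates = [dba(records) for records in records_per_class.values()]
```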
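The real-time classification step could be sketched as follows: each particle hypothesises a gesture class and a phase (position) within that class's template, is reweighted by how well the incoming sample matches the template value at that phase, and the per-class sum of particle weights approximates the class probabilities. The state representation, Gaussian noise models, and systematic resampling used here are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

class ParticleGestureClassifier:
    """Sketch of a particle filter that estimates which template gesture
    a streaming 1D sensor signal belongs to (assumed formulation)."""

    def __init__(self, templates, n_particles=1000, sigma_obs=0.5, sigma_step=1.0):
        self.templates = templates              # list of 1D template gestures
        self.n = n_particles
        self.sigma_obs = sigma_obs              # observation noise (assumption)
        self.sigma_step = sigma_step            # phase progression noise (assumption)
        # Initialise particles uniformly over classes, at the start of each template.
        self.cls = np.random.randint(len(templates), size=n_particles)
        self.pos = np.zeros(n_particles)
        self.w = np.full(n_particles, 1.0 / n_particles)

    def update(self, observation):
        """Propagate particles, reweight against the new sample, resample."""
        # Propagation: advance each particle's phase by roughly one sample.
        self.pos += 1.0 + self.sigma_step * np.random.randn(self.n)
        for k, tpl in enumerate(self.templates):
            mask = self.cls == k
            self.pos[mask] = np.clip(self.pos[mask], 0, len(tpl) - 1)
        # Weighting: Gaussian likelihood of the observed value given the
        # template value at each particle's phase.
        expected = np.array([self.templates[c][int(p)]
                             for c, p in zip(self.cls, self.pos)])
        self.w *= np.exp(-0.5 * ((observation - expected) / self.sigma_obs) ** 2)
        self.w += 1e-300                        # guard against all-zero weights
        self.w /= self.w.sum()
        # Systematic resampling to avoid weight degeneracy.
        idx = np.searchsorted(np.cumsum(self.w),
                              (np.arange(self.n) + np.random.rand()) / self.n)
        idx = np.minimum(idx, self.n - 1)
        self.cls, self.pos = self.cls[idx], self.pos[idx]
        self.w = np.full(self.n, 1.0 / self.n)

    def class_probabilities(self):
        """Estimated probability of the current gesture belonging to each class."""
        return np.array([self.w[self.cls == k].sum()
                         for k in range(len(self.templates))])
```

In use, one classifier instance would be fed each incoming accelerometer sample through update(), and class_probabilities() could be queried at any time to read the current estimate.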