Sensor-based Pattern Recognition Identifying
Complex Upper Extremity Skills
Ryanne Lemmens 1,2, Yvonne Janssen-Potten 1,2, Annick Timmermans 1,3, Rob Smeets 1,2 and Henk Seelen 1,2
1 Department of Rehabilitation Medicine, Research School CAPHRI, Maastricht University, Maastricht, Netherlands
2 Adelante, Centre of Expertise in Rehabilitation and Audiology, Hoensbroek, Netherlands
3 BIOMED Biomedical Research Institute, Hasselt University, Hasselt, Belgium
1 OBJECTIVES
Objectively quantifying actual arm-hand
performance is very important to evaluate arm-hand
therapy efficacy in patients with neurological
disorders. Currently, objective assessments are
limited to evaluation of ‘general arm-hand activity’,
whereas monitoring of specific arm-hand skills is
not yet possible. Instruments to identify skills and
determine both amount and quality of actual arm-
hand use in daily life are lacking, necessitating the
development of a new measure. To identify skills,
pattern recognition techniques can be used.
Commonly used pattern recognition approaches are:
statistical classification, neural networks, structural
matching and template matching (Jain et al., 2000).
The latter approach is used in the present study,
which aims to provide proof-of-principle of skill
identification, illustrated for the skill drinking in
both a standardized setting and a daily life situation
in a healthy subject.
2 METHODS
Four sensor devices, each containing a tri-axial
accelerometer, tri-axial gyroscope and tri-axial
magnetometer were attached to the dominant hand,
wrist, upper arm and chest of participants. Thirty
healthy individuals performed the skill drinking 5
times in a standardized manner, i.e. with similar
starting position and instruction about how to
perform the skill. In addition, for one person a
30-minute recording in daily life was made,
comprising multiple skills, including four instances
of the skill drinking.
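The recordings described above can be thought of as one signal matrix per attempt. A minimal sketch of the assumed data layout follows; the sampling rate, durations, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

N_SENSORS = 4    # hand, wrist, upper arm, chest
N_CHANNELS = 9   # tri-axial accelerometer + gyroscope + magnetometer
FS = 100         # assumed sampling rate in Hz (not stated in the paper)

def make_attempt(duration_s, rng):
    """One recorded attempt: a (samples x sensors*channels) signal matrix.

    Random data stands in for real sensor signals in this sketch.
    """
    n = int(duration_s * FS)
    return rng.standard_normal((n, N_SENSORS * N_CHANNELS))

rng = np.random.default_rng(0)
# Five attempts of the skill drinking, with slightly varying durations
attempts = [make_attempt(3.0 + 0.2 * i, rng) for i in range(5)]
print([a.shape for a in attempts])
```

Each attempt thus yields 36 synchronized channels; the varying row counts are what the time normalization in step 2 below corrects for.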
Signals were filtered with a 4th-order zero-time-lag
low-pass Butterworth filter (cut-off frequency:
2.5 Hz). Data analysis consisted of the following
steps: 1) temporal delimitation of each of the five
attempts of the skill drinking, i.e. identifying the
start and endpoint of each attempt recorded; 2)
normalization of the signals in the time domain in
order to correct for (small) variations due to
differences in speed of task execution; 3) averaging
the signal matrices from the five attempts of each
person to obtain the individual template, i.e. the
underlying ensemble-averaged signal matrix per task
per individual, and averaging the individual
templates of multiple persons to create a generic
template; 4) identification of dominant sub-phases
of the templates within a specific task, using
Gaussian-based linear envelope decomposition
procedures; 5) recognition of specific
skill execution among various skills performed
daily, i.e. searching for template occurrence among
signal recordings gathered in a standardized setting
and a daily life condition, using feature extraction
and pattern recognition algorithms based on 2D
convolution. Cross-correlation coefficients were
calculated to quantify goodness-of-fit.
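The pipeline above can be sketched as follows. This is a simplified, assumed implementation, not the authors' code: the sampling rate and template length are invented, time normalization is done by linear resampling, step 4 (Gaussian-based envelope decomposition) is omitted, and a sliding Pearson cross-correlation stands in for the 2D-convolution-based matching:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100  # assumed sampling rate in Hz (not stated in the paper)

def lowpass(sig, fs=FS, fc=2.5, order=4):
    """4th-order low-pass Butterworth filter; filtfilt gives zero time lag."""
    b, a = butter(order, fc / (fs / 2), btype="low")
    return filtfilt(b, a, sig, axis=0)

def time_normalize(sig, n_samples=200):
    """Step 2: resample an attempt to a fixed length to remove speed differences."""
    t_old = np.linspace(0.0, 1.0, sig.shape[0])
    t_new = np.linspace(0.0, 1.0, n_samples)
    return np.stack(
        [np.interp(t_new, t_old, sig[:, c]) for c in range(sig.shape[1])], axis=1
    )

def build_template(attempts, n_samples=200):
    """Step 3: ensemble-average filtered, time-normalized attempts into a template."""
    return np.mean([time_normalize(lowpass(a), n_samples) for a in attempts], axis=0)

def match_template(recording, template):
    """Step 5 (simplified): slide the template over a recording and compute a
    Pearson cross-correlation coefficient at each offset."""
    rec = lowpass(recording)
    n, flat_t = template.shape[0], template.ravel()
    scores = np.empty(rec.shape[0] - n + 1)
    for i in range(scores.size):
        scores[i] = np.corrcoef(rec[i:i + n].ravel(), flat_t)[0, 1]
    return scores  # peaks above a threshold mark candidate skill executions
```

Thresholding the resulting score trace yields candidate occurrences of the skill; the trade-off between missed detections and false positives reported in the Results corresponds to the choice of that threshold.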
3 RESULTS
Performance of the skill drinking was identified
unambiguously (100%) in the standardized setting
(figure 1a). For the templates consisting of the
complete skill, mean cross-correlation was 0.93 for
the individual template and 0.79 for the generic
template. For the templates consisting of sub-phases,
mean cross-correlations ranged between 0.89 and
0.99 for the individual template and between 0.78
and 0.86 for the generic template.
In the daily life recording, all instances at which
drinking was performed were recognized with the
template consisting of the complete skill (mean
cross-correlation: 0.51) (figure 1b). However, five
false-positive detections were also present (mean
cross-correlation: 0.46). Using the template
consisting of the sub-phases, in general the skill
Copyright © 2014 SCITEPRESS (Science and Technology Publications, Lda.)