5 CONCLUSION AND FUTURE WORK
The main motivation of this paper is to evaluate the performance of activity recognition with wrist-worn devices using inertial sensors and, in particular, to analyse the performance achieved with different feature sets. We categorize the features into three classes, namely motion-related, orientation-related and rotation-related features, and we analyse the performance using motion, orientation and rotation information both alone and in combination. We utilize a dataset collected from 10 participants performing thirteen activities and use decision tree, naive Bayes and random forest classification algorithms in the analysis. The results show that orientation features achieve the highest accuracies, both when used alone and when combined with the other feature types. However, the combination of all features (motion, orientation and rotation) does not usually improve the results. Considering the average accuracies, the random forest classifier achieves the highest performance. Additionally, using only raw acceleration performs slightly better (89%) than using linear acceleration, and similarly to using the gyroscope. Hence, our results show that an accelerometer-only solution can perform as well as using linear acceleration or using both an accelerometer and a gyroscope. The main advantage is that an acceleration-only solution consumes less battery power, which is an important factor for real-time, continuously running applications.
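To make the evaluation procedure concrete, the sketch below shows how the three feature groups and the three classifiers could be compared with scikit-learn. This is a minimal illustration rather than the authors' implementation: the feature column names, the data file wrist_features.csv and the 10-fold cross-validation setting are our own assumptions.

```python
# Illustrative sketch (not the authors' code): compare feature-group
# combinations with the three classifiers named in the paper, using
# scikit-learn. Column names and the data file are hypothetical.
from itertools import combinations

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature groups extracted from wrist-worn inertial data.
FEATURE_GROUPS = {
    "motion": ["acc_mean_x", "acc_mean_y", "acc_mean_z", "acc_std_x"],
    "orientation": ["pitch_mean", "roll_mean", "pitch_std", "roll_std"],
    "rotation": ["gyro_mean_x", "gyro_mean_y", "gyro_mean_z", "gyro_std_x"],
}

CLASSIFIERS = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}


def evaluate(df: pd.DataFrame, label_col: str = "activity") -> None:
    """Cross-validate every classifier on every feature-group combination."""
    for r in range(1, len(FEATURE_GROUPS) + 1):
        for combo in combinations(FEATURE_GROUPS, r):
            cols = [c for g in combo for c in FEATURE_GROUPS[g]]
            X, y = df[cols].values, df[label_col].values
            for name, clf in CLASSIFIERS.items():
                scores = cross_val_score(clf, X, y, cv=10)
                print(f"{'+'.join(combo):30s} {name:15s} "
                      f"accuracy={scores.mean():.3f}")


if __name__ == "__main__":
    evaluate(pd.read_csv("wrist_features.csv"))  # hypothetical input file
```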
We are currently collecting a dataset using smartwatches, focusing particularly on the recognition of smoking. As future work, we plan to apply the same methodology to this new dataset. Moreover, we aim to apply feature selection methods to reduce the number of features used, and to analyse the battery consumption on a smartwatch.
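As a rough illustration of the planned feature-selection step, the following sketch ranks features by random forest importance and keeps the top k, again using scikit-learn. The function name, data file and value of k are hypothetical, and the authors may adopt a different selection method.

```python
# Illustrative sketch of a feature-selection step (an assumption, not the
# authors' method): rank features by random-forest importance, keep the top k.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier


def top_k_features(df: pd.DataFrame, label_col: str = "activity", k: int = 20):
    """Return the k features with the highest random-forest importance."""
    X = df.drop(columns=[label_col])
    y = df[label_col]
    forest = RandomForestClassifier(n_estimators=200, random_state=0)
    forest.fit(X, y)
    order = np.argsort(forest.feature_importances_)[::-1]
    return list(X.columns[order[:k]])


if __name__ == "__main__":
    # hypothetical input file
    print(top_k_features(pd.read_csv("wrist_features.csv")))
```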
ACKNOWLEDGEMENTS
This work is supported by the Galatasaray University Research Fund under Grant Number 15.401.004, by Tubitak under Grant Number 113E271, and by the Dutch National Program COMMIT in the context of the SWELL project.