
standing of HAR, especially in the context of health
monitoring and assessment.
ACKNOWLEDGEMENTS
This study was funded by the German Federal Ministry
of Education and Research (Project No. 01ZZ2007).
REFERENCES
Abbaspour, S., Fotouhi, F., Sedaghatbaf, A., Fotouhi, H., Vahabi, M., and Linden, M. (2020). A comparative analysis of hybrid deep learning models for human activity recognition. Sensors, 20(19):5707.
Aboo, A. K. and Ibrahim, L. M. (2022). Human activity recognition using a hybrid LSTM-CNN deep neural network. Webology.
Augustinov, G., Nisar, M. A., Li, F., Tabatabaei, A., Grzegorzek, M., Sohrabi, K., and Fudickar, S. (2023). Transformer-based recognition of activities of daily living from wearable sensor data. In iWOAR '22, pages 1–8, Rostock, Germany. ACM.
Bock, M., Hoelzemann, A., Moeller, M., and Laerhoven, K. V. (2022). Investigating (re)current state-of-the-art in human activity recognition datasets. Frontiers in Computer Science, 4.
Ciortuz, G., Grzegorzek, M., and Fudickar, S. (2023). Effects of time-series data pre-processing on the transformer-based classification of activities from smart glasses. In iWOAR '23, New York, NY, USA. Association for Computing Machinery.
Dave, R., Seliya, N., Vanamala, M., and Tee, W. (2022). Human activity recognition models using limited consumer device sensors and machine learning. CoRR, abs/2201.08565.
Friedrich, B., Cauchi, B., Hein, A., and Fudickar, S. (2019). Transportation mode classification from smartphone sensors via a long-short-term-memory network. Pages 709–713, New York, NY, USA. Association for Computing Machinery.
Gil-Martín, M., San-Segundo, R., Fernández-Martínez, F., and de Córdoba, R. (2020). Human activity recognition adapted to the type of movement. Computers & Electrical Engineering, 88:106822.
Van Remoortel, H., Giavedoni, S., Raste, Y., Burtin, C., Louvaris, Z., Gimeno-Santos, E., Langer, D., Glendenning, G., Hopkinson, N., Vogiatzis, I., Peterson, B., Wilson, F., Mann, B., Rabinovich, R., Puhan, M., and Troosters, T. (2012). Validity of activity monitors in health and chronic disease: A systematic review. International Journal of Behavioral Nutrition and Physical Activity, 9:84.
Hellmers, S., Kromke, T., Dasenbrock, L., Heinks, A., Bauer, J. M., Hein, A., and Fudickar, S. (2018). Stair climb power measurements via inertial measurement units – towards an unsupervised assessment of strength in domestic environments. In BIOSTEC 2018 – HEALTHINF, pages 39–47, Funchal.
Hussain, Z., Sheng, Q. Z., and Zhang, W. E. (2020). A review and categorization of techniques on device-free human activity recognition. Journal of Network and Computer Applications, 167:102738.
Irshad, M. T., Nisar, M. A., Huang, X., Hartz, J., Flak, O., Li, F., Gouverneur, P., Piet, A., Oltmanns, K. M., and Grzegorzek, M. (2022). SenseHunger: Machine learning approach to hunger detection using wearable sensors. Sensors, 22(20).
Kleiner, A. F. R., Pacifici, I., Vagnini, A., Camerota, F., Celletti, C., Stocchi, F., De Pandis, M. F., and Galli, M. (2018). Timed up and go evaluation with wearable devices: Validation in Parkinson's disease. Journal of Bodywork and Movement Therapies, 22(2):390–395.
Li, F., Shirahama, K., Nisar, M. A., Huang, X., and Grzegorzek, M. (2020). Deep transfer learning for time series data based on sensor modality classification. Sensors, 20(15):4271.
Li, F., Shirahama, K., Nisar, M. A., Köping, L., and Grzegorzek, M. (2018). Comparison of feature learning methods for human activity recognition using wearable sensors. Sensors, 18(2).
Mahmud, S., Tonmoy, T. H. M., Bhaumik, K. K., Rahman, M. A. K., Amin, A. M., Shoyaib, M., Khan, M. A. H., and Ali, A. A. (2020). Human activity recognition from wearable sensor data using self-attention.
Mekruksavanich, S., Jitpattanakul, A., Youplao, P., and Yupapin, P. (2020). Enhanced hand-oriented activity recognition based on smartwatch sensor data using LSTMs. Symmetry, 12:1570.
Morshed, M. G., Sultana, T., Alam, A., and Lee, Y.-K. (2023). Human action recognition: A taxonomy-based survey, updates, and opportunities. Sensors, 23(4).
Nisar, M. A., Shirahama, K., Li, F., Huang, X., and Grzegorzek, M. (2020). Rank pooling approach for wearable sensor-based ADLs recognition. Sensors, 20(12):3463.
Ordóñez, F. J. and Roggen, D. (2016). Deep convolutional and LSTM recurrent neural networks for multimodal wearable activity recognition. Sensors, 16(1).
Phyo, P. and Byun, Y. (2021). Hybrid ensemble deep learning-based approach for time series energy prediction. Symmetry, 13:1942.
Roggen, D., Calatroni, A., Nguyen-Dinh, L.-V., Chavarriaga, R., and Sagha, H. (2012). OPPORTUNITY Activity Recognition. UCI Machine Learning Repository.
Roy, B., Malviya, L., Kumar, R., Mal, S., Kumar, A., Bhowmik, T., and Hu, J. (2023). Hybrid deep learning approach for stress detection using decomposed EEG signals. Diagnostics, 13.
Uddin, M. Z. and Soylu, A. (2021). Human activity recognition using wearable sensors, discriminant analysis, and long short-term memory-based neural structured learning. Scientific Reports, 11(1).
Zhang, S., Li, Y., Zhang, S., Shahabi, F., Xia, S., Deng, Y., and Alshurafa, N. (2022). Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors, 22(4).
Evaluating Movement and Device-Specific DeepConvLSTM Performance in Wearable-Based Human Activity Recognition