ACKNOWLEDGEMENTS
This work has been supported by CTU grant no. SGS23/177/OHK3/3T/13, by the Czech Science Foundation (GAČR) under research project No. 23-07517S, by the European Union under the project Robotics and Advanced Industrial Production (reg. no. CZ.02.01.01/00/22_008/0004590), by TACR project no. FW03010020, by the National Council for Scientific and Technological Development (CNPq), the National Fund for Scientific and Technological Development (FNDCT), and the Ministry of Science, Technology and Innovations (MCTI) of Brazil under research projects No. 304551/2023-6 and 407334/2022-0, and by the Paraíba State Research Support Foundation (FAPESQ) under research project No. 3030/2021.
REFERENCES
Baca, T., Petrlik, M., Vrba, M., Spurny, V., Penicka, R., Hert, D., and Saska, M. (2021). The MRS UAV system: Pushing the frontiers of reproducible research, real-world deployment, and education with autonomous unmanned aerial vehicles. Journal of Intelligent & Robotic Systems, 102:26.
Bazarevsky, V., Grishchenko, I., Raveendran, K., Zhu, T., Zhang, F., and Grundmann, M. (2020). BlazePose: On-device real-time body pose tracking.
Celebi, S., Aydin, A. S., Temiz, T. T., and Arici, T. (2013). Gesture recognition using skeleton data with weighted dynamic time warping. volume 1, pages 620–625.
Chaudhary, A., Nascimento, T., and Saska, M. (2022). Controlling a swarm of unmanned aerial vehicles using full-body k-nearest neighbor based action classifier. pages 544–551. IEEE.
Chen, C., Jafari, R., and Kehtarnavaz, N. (2015). UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor. pages 168–172. IEEE.
Gundogdu, K., Bayrakdar, S., and Yucedag, I. (2018). Developing and modeling of voice control system for prosthetic robot arm in medical systems. Journal of King Saud University - Computer and Information Sciences, 30:198–205.
Hert, D., Baca, T., Petracek, P., Kratky, V., Penicka, R., Spurny, V., Petrlik, M., Vrba, M., Zaitlik, D., Stoudek, P., Walter, V., Stepan, P., Horyna, J., Pritzl, V., Sramek, M., Ahmad, A., Silano, G., Licea, D. B., Stibinger, P., Nascimento, T., and Saska, M. (2023). MRS Drone: A modular platform for real-world deployment of aerial multi-robot systems. Journal of Intelligent & Robotic Systems, 108:64.
Hert, D., Baca, T., Petracek, P., Kratky, V., Spurny, V., Petrlik, M., Vrba, M., Zaitlik, D., Stoudek, P., Walter, V., Stepan, P., Horyna, J., Pritzl, V., Silano, G., Licea, D. B., Stibinger, P., Penicka, R., Nascimento, T., and Saska, M. (2022). MRS modular UAV hardware platforms for supporting research in real-world outdoor and indoor environments. pages 1264–1273. IEEE.
Jiao, R., Wang, Z., Chu, R., Dong, M., Rong, Y., and Chou, W. (2020). An intuitive end-to-end human-UAV interaction system for field exploration. Frontiers in Neurorobotics, 13.
Krusche, S., Al Naser, I., Bdiwi, M., and Ihlenfeldt, S. (2023). A novel approach for automatic annotation of human actions in 3D point clouds for flexible collaborative tasks with industrial robots. Frontiers in Robotics and AI, 10.
Mohebbi, A. (2020). Human-robot interaction in rehabilitation and assistance: a review. Current Robotics Reports, 1:131–144.
Mokhtarzadeh, A. A. and Yangqing, Z. J. (2018). Human-robot interaction and self-driving cars safety integration of dispositif networks. pages 494–499. IEEE.
Park, S., Wang, X., Menassa, C. C., Kamat, V. R., and Chai, J. Y. (2024). Natural language instructions for intuitive human interaction with robotic assistants in field construction work. Automation in Construction, 161:105345.
Rwigema, J., Choi, H. R., and Kim, T. (2019). A differential evolution approach to optimize weights of dynamic time warping for multi-sensor based gesture recognition. Sensors (Switzerland), 19.
Sathiyanarayanan, M., Mulling, T., and Nazir, B. (2015). Controlling a robot using a wearable device (Myo).
Schneider, P., Memmesheimer, R., Kramer, I., and Paulus, D. (2019). Gesture recognition in RGB videos using human body keypoints and dynamic time warping.
Vasconez, J. P., Kantor, G. A., and Cheein, F. A. A. (2019). Human–robot interaction in agriculture: A survey and current challenges. Biosystems Engineering, 179:35–48.
Vysocky, A. and Novak, P. (2016). Human–robot collaboration in industry. MM Science Journal, 2016:903–906.
Yamada, H., Muto, T., and Ohashi, G. (2015). Development of a telerobotics system for construction robot using virtual reality. pages 2975–2979. IEEE.
Yoo, M., Na, Y., Song, H., Kim, G., Yun, J., Kim, S., Moon, C., and Jo, K. (2022). Motion estimation and hand gesture recognition-based human–UAV interaction approach in real time. Sensors, 22:2513.
Zhang, J., Yu, Z., Wang, X., Lyu, Y., Mao, S., Periaswamy, S. C., Patton, J., and Wang, X. (2018). RFHUI: An intuitive and easy-to-operate human-UAV interaction system for controlling a UAV in a 3D space. In Proceedings of the 15th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, MobiQuitous '18, pages 69–76, New York, NY, USA. Association for Computing Machinery.