REFERENCES
Bochkovskiy, A., Wang, C.-Y., and Liao, H.-Y. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934.
Cai, C., Yang, S., Yan, P., Tian, J., Du, L., and Yang, X. (2019). Real-time human-posture recognition for human-drone interaction using monocular vision. In Yu, H., Liu, J., Liu, L., Ju, Z., Liu, Y., and Zhou, D., editors, Intelligent Robotics and Applications, pages 203–216, Cham. Springer International Publishing.
Cao, Z., Hidalgo, G., Simon, T., Wei, S.-E., and Sheikh, Y. (2021). OpenPose: Realtime multi-person 2D pose estimation using part affinity fields. IEEE Transactions on Pattern Analysis and Machine Intelligence, 43(1):172–186.
Chen, Y., Tian, Y., and He, M. (2020). Monocular human pose estimation: A survey of deep learning-based methods. Computer Vision and Image Understanding, 192:102897.
Eißfeldt, H., Vogelpohl, V., Stolz, M., Papenfuß, A., Biella, M., Belz, J., and Kügler, D. (2020). The acceptance of civil drones in Germany. CEAS Aeronautical Journal, 11.
Kassab, M. A., Ahmed, M., Maher, A., and Zhang, B. (2020). Real-time human-UAV interaction: New dataset and two novel gesture-based interacting systems. IEEE Access, 8:195030–195045.
Kedilioglu, O., Lieret, M., Schottenhamml, J., Würfl, T., Blank, A., Maier, A., and Franke, J. (2021). RGB-D-based human detection and segmentation for mobile robot navigation in industrial environments. In VISIGRAPP (4: VISAPP), pages 219–226.
Koch, J., Wettach, J., Bloch, E., and Berns, K. (2007). Indoor localisation of humans, objects, and mobile robots with RFID infrastructure. In 7th International Conference on Hybrid Intelligent Systems (HIS 2007), pages 271–276.
Le, T.-L., Nguyen, M.-Q., and Nguyen, T.-T.-M. (2013). Human posture recognition using human skeleton provided by Kinect. In 2013 International Conference on Computing, Management and Telecommunications (ComManTel), pages 340–345.
Lidynia, C., Philipsen, R., and Ziefle, M. (2017). Droning on about drones—acceptance of and perceived barriers to drones in civil usage contexts. In Savage-Knepshield, P. and Chen, J., editors, Advances in Human Factors in Robots and Unmanned Systems, Advances in Intelligent Systems and Computing, vol. 499. Springer, Cham.
Liu, C. and Szirányi, T. (2021). Real-time human detection and gesture recognition for on-board UAV rescue. Sensors, 21(6).
Maher, A., Li, C., Hu, H., and Zhang, B. (2017). Realtime human-UAV interaction using deep learning. In Zhou, J., Wang, Y., Sun, Z., Xu, Y., Shen, L., Feng, J., Shan, S., Qiao, Y., Guo, Z., and Yu, S., editors, Biometric Recognition, pages 511–519, Cham. Springer International Publishing.
Medeiros, A. C. S., Ratsamee, P., Uranishi, Y., Mashita, T., and Takemura, H. (2020). Human-drone interaction: Using pointing gesture to define a target object. In Kurosu, M., editor, Human-Computer Interaction. Multimodal and Natural Interaction, pages 688–705, Cham. Springer International Publishing.
Monajjemi, M., Bruce, J., Sadat, S. A., Wawerla, J., and Vaughan, R. (2015). UAV, do you see me? Establishing mutual attention between an uninstrumented human and an outdoor UAV in flight. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 3614–3620.
Monajjemi, M., Mohaimenianpour, S., and Vaughan, R. (2016). UAV, come to me: End-to-end, multi-scale situated HRI with an uninstrumented human and a distant UAV. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 4410–4417.
Mosberger, R. and Andreasson, H. (2013). An inexpensive monocular vision system for tracking humans in industrial environments. In 2013 IEEE International Conference on Robotics and Automation, pages 5850–5857.
Pounds, P. E. I. and Deer, W. (2018). The safety rotor—an electromechanical rotor safety system for drones. IEEE Robotics and Automation Letters, 3(3):2561–2568.
Sanna, A., Lamberti, F., Paravati, G., Ramirez, E. H., and Manuri, F. (2012). A Kinect-based natural interface for quadrotor control. In Intelligent Technologies for Interactive Entertainment: 4th International ICST Conference, INTETAIN 2011, Genova, Italy, May 25-27, 2011, Revised Selected Papers.
Tellaeche, A., Kildal, J., and Maurtua, I. (2018). A flexible system for gesture based human-robot interaction. Procedia CIRP, 72:57–62. 51st CIRP Conference on Manufacturing Systems.
Yu, Y., Wang, X., Zhong, Z., and Zhang, Y. (2017). ROS-based UAV control using hand gesture recognition. In 2017 29th Chinese Control And Decision Conference (CCDC), pages 6795–6799.
Zhang, J., Peng, L., Feng, W., Ju, Z., and Liu, H. (2019a). Human-AGV interaction: Real-time gesture detection using deep learning. In Yu, H., Liu, J., Liu, L., Ju, Z., Liu, Y., and Zhou, D., editors, Intelligent Robotics and Applications, pages 231–242, Cham. Springer International Publishing.
Zhang, S., Liu, X., Yu, J., Zhang, L., and Zhou, X. (2019b). Research on multi-modal interactive control for quadrotor UAV. In 2019 IEEE 16th International Conference on Networking, Sensing and Control (ICNSC), pages 329–334.