socially aware engagement for human–robot first encounters. International Journal of Social Robotics, 13:1851–1877.
Bröhl, C., Nelles, J., Brandl, C., Mertens, A., and Nitsch, V. (2019). Human-robot collaboration acceptance model: Development and comparison for Germany, Japan, China and the USA. International Journal of Social Robotics, 11:709–726.
Carvalho, M., Avelino, J., Bernardino, A., Ventura, R. M. M., and Moreno, P. (2021). Human-robot greeting: tracking human greeting mental states and acting accordingly. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 1935–1941.
Chuah, S. H.-W. and Yu, J. (2021). The future of service: The power of emotion in human-robot interaction. Journal of Retailing and Consumer Services, 61:102551.
Dinges, L., Al-Hamadi, A., Hempel, T., and Al Aghbari, Z. (2021). Using facial action recognition to evaluate user perception in aggravated HRC scenarios. In 2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA), pages 195–199.
Dini, A., Murko, C., Yahyanejad, S., Augsdörfer, U., Hofbaur, M., and Paletta, L. (2017). Measurement and prediction of situation awareness in human-robot interaction based on a framework of probabilistic attention. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 4354–4361.
Elprama, S., El Makrini, I., Vanderborght, B., and Jacobs, A. (2016). Acceptance of collaborative robots by factory workers: a pilot study on the role of social cues of anthropomorphic robots.
Finke, M., Koay, K. L., Dautenhahn, K., Nehaniv, C., Walters, M., and Saunders, J. (2005). Hey, I'm over here - how can a robot attract people's attention? In RO-MAN 2005 - IEEE International Workshop on Robot and Human Interactive Communication, pages 7–12.
Fischer, K., Naik, L., Langedijk, R. M., Baumann, T., Jelínek, M., and Palinko, O. (2021). Initiating human-robot interactions using incremental speech adaptation. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, HRI '21 Companion, pages 421–425, New York, NY, USA. Association for Computing Machinery.
Foster, M. E., Gaschler, A., and Giuliani, M. (2017). Automatically classifying user engagement for dynamic multi-party human–robot interaction. International Journal of Social Robotics, 9.
Hempel, T., Abdelrahman, A. A., and Al-Hamadi, A. (2022). 6D rotation representation for unconstrained head pose estimation. In 2022 IEEE International Conference on Image Processing (ICIP), pages 2496–2500.
Howard, A., Sandler, M., Chu, G., Chen, L.-C., Chen, B., Tan, M., Wang, W., Zhu, Y., Pang, R., Vasudevan, V., Le, Q. V., and Adam, H. (2019). Searching for MobileNetV3. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV).
Kato, Y., Kanda, T., and Ishiguro, H. (2015). May I help you? Design of human-like polite approaching behavior. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 35–42.
Kendon, A. (1990). Conducting Interaction: Patterns of Behavior in Focused Encounters. Cambridge University Press, Cambridge, U.K.
Miller, L., Kraus, J., Babel, F., and Baumann, M. (2021). More than a feeling—interrelation of trust layers in human-robot interaction and the role of user dispositions and state anxiety. Frontiers in Psychology, 12.
Mollahosseini, A., Hasani, B., and Mahoor, M. H. (2017). AffectNet: A database for facial expression, valence, and arousal computing in the wild. IEEE Transactions on Affective Computing, 10:18–31.
Müller-Abdelrazeq, S. L., Schönefeld, K., Haberstroh, M., and Hees, F. (2019). Interacting with Collaborative Robots—A Study on Attitudes and Acceptance in Industrial Contexts, pages 101–117. Springer International Publishing, Cham.
Mumm, J. and Mutlu, B. (2011). Human-robot proxemics: Physical and psychological distancing in human-robot interaction. In 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 331–338.
Naneva, S., Gou, M. S., Webb, T. L., and Prescott, T. J. (2020). A systematic review of attitudes, anxiety, acceptance, and trust towards social robots. International Journal of Social Robotics, pages 1–23.
Oertel, C., Castellano, G., Chetouani, M., Nasir, J., Obaid, M., Pelachaud, C., and Peters, C. (2020). Engagement in human-agent interaction: An overview. Frontiers in Robotics and AI, 7.
Repiso, E., Garrell, A., and Sanfeliu, A. (2018). Robot approaching and engaging people in a human-robot companion framework. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 8200–8205.
Satake, S., Kanda, T., Glas, D. F., Imai, M., Ishiguro, H., and Hagita, N. (2009). How to approach humans? Strategies for social robots to initiate interaction. In 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 109–116.
Spezialetti, M., Placidi, G., and Rossi, S. (2020). Emotion recognition for human-robot interaction: Recent advances and future perspectives. Frontiers in Robotics and AI, 7.
Syrdal, D. S., Lee Koay, K., Walters, M. L., and Dautenhahn, K. (2007). A personalized robot companion? The role of individual differences on spatial preferences in HRI scenarios. In RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication, pages 1143–1148.
Vinkemeier, D., Valstar, M. F., and Gratch, J. (2018). Predicting folds in poker using action unit detectors and decision trees. In FG, pages 504–511.
Wegrzyn, M., Vogt, M., Kireclioglu, B., Schneider, J., and Kissler, J. (2017). Mapping the emotional face. How individual face parts contribute to successful emotion recognition. PLOS ONE, 12(5):e0177239.