All in all, as expected, we found that using car-like turn signals on vehicle-shaped mobile robots is a good idea, and we recommend it as a design guideline for the robotics community. What we did not expect was that the less visible bottom lights were at least as important for signaling as the top LED rings, which we had expected to be more informative. We also noticed a tendency for people to pass an oncoming robot on the right side, as in vehicular traffic; however, this effect was weaker than expected, as many participants chose to pass on the left side of the robot as well. As future work, it would be interesting to compare this effect with populations that drive on the left-hand side of the road. Following our findings, we will also consider redesigning the robot's lights and reinforcing the ones near its bottom. Finally, building on the conclusions of this controlled study on the most effective ways to signal turning intent, we plan to drive the robot in the university hallways, collect data in this uncontrolled environment, and analyze how people flow around the robot depending on its turn signaling.
ACKNOWLEDGEMENTS
This work was supported by the project Health-CAT,
funded by the European Regional Development Fund.