7 CONCLUSIONS
The PEAR framework presented in this paper responds to the growing need for prototyping and design tools that accompanies the emergence of the social robot market. We showed that existing social robots share common basic components and how those components can be animated using 3D animation tools and methods. This enables the rapid prototyping of expressive animated robots, as we demonstrated with the prototyping of the Mia robot. We also showed how this framework can be used to easily create an animation tool for an existing robot, using Poppy as an example.
The case studies also revealed some limitations of the current system. Animating a robot by mapping the state of virtual objects onto its actuators works very well for simple robots such as Mia, which have few degrees of freedom and no possibility of self-collision. However, more complex robots like Poppy would benefit from features that manage the robot's physical constraints for the user. For example, (Nakaoka, 2012) presented an animation tool that integrates automated balancing of the robot. Our system could also be improved through better input modalities. In our previous work (Balit et al., 2016), we showed how a robot can itself be used as a tangible interface for defining poses, thus avoiding the reliance on inverse kinematics. We are looking into ways of integrating those features into the presented system.
We believe that fostering a design community will be essential for the development of social robots, and having a common prototyping tool would be an important step in this direction. 3D animation tools have large communities of animators and character designers whose skills could greatly improve the user experience of social robots. Our work extending a 3D animation tool for robot prototyping is a first step towards building a bridge between those artists and social robot design.
ACKNOWLEDGEMENTS
This work was partly funded by the Agence
Nationale de la Recherche (ANR) under the project
Amiqual4Home ANR-11-EQPX-0002.
REFERENCES
Al Moubayed, S., Beskow, J., Skantze, G., and Granström, B. (2012). Furhat: A back-projected human-like robot head for multiparty human-machine interaction. Cognitive Behavioural Systems, pages 114–130.
Balit, E., Vaufreydaz, D., and Reignier, P. (2016). Integrating animation artists into the animation design of social robots: An open-source robot animation software. In The Eleventh ACM/IEEE International Conference on Human Robot Interaction, pages 417–418. IEEE Press.
Beira, R., Lopes, M., Praça, M., Santos-Victor, J., Bernardino, A., Metta, G., Becchi, F., and Saltarén, R. (2006). Design of the Robot-Cub (iCub) head. In Robotics and Automation, 2006. ICRA 2006. Proceedings 2006 IEEE International Conference on, pages 94–100. IEEE.
Breazeal, C. and Scassellati, B. (1999). How to build robots that make friends and influence people. In Intelligent Robots and Systems, 1999. IROS'99. Proceedings. 1999 IEEE/RSJ International Conference on, volume 2, pages 858–863. IEEE.
Brooks, A. G., Gray, J., Hoffman, G., Lockerd, A., Lee, H., and Breazeal, C. (2004). Robot's play: Interactive games with sociable machines. Computers in Entertainment (CIE), 2(3):10–10.
Chao, C., Gielniak, M., Yoo, J. W., and Thomaz, A. L. (2010). Interactive learning by demonstration with the Simon robot. In Proceedings of the 9th AAAI Conference on Enabling Intelligence Through Middleware, pages 2–2. AAAI Press.
Duffy, B. R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42(3):177–190. Socially Interactive Robots.
Fitzpatrick, R. J. (2012). Designing and constructing an animatronic head capable of human motion programmed using face-tracking software. PhD thesis, Worcester Polytechnic Institute.
Fujita, M. (2001). AIBO: Toward the era of digital creatures. The International Journal of Robotics Research, 20(10):781–794.
Gouaillier, D., Hugel, V., Blazevic, P., Kilner, C., Monceaux, J., Lafourcade, P., Marnier, B., Serre, J., and Maisonnier, B. (2009). Mechatronic design of NAO humanoid. In Robotics and Automation, 2009. ICRA'09. IEEE International Conference on, pages 769–774. IEEE.
Hanson, D., Baurmann, S., Riccio, T., Margolin, R., Dockins, T., Tavares, M., and Carpenter, K. (2009). Zeno: A cognitive character. In AI Magazine, and special proc. of AAAI National Conference, Chicago.
Hoffman, G. (2012). Dumb robots, smart phones: A case study of music listening companionship. In RO-MAN, 2012 IEEE, pages 358–363. IEEE.
Hoffman, G. et al. (2007). Ensemble: Fluency and embodiment for robots acting with humans. PhD thesis, Massachusetts Institute of Technology.
Hoffman, G. and Ju, W. (2014). Designing robots with movement in mind. Journal of Human-Robot Interaction, 3(1):89–122.
Lapeyre, M., Rouanet, P., Grizou, J., Nguyen, S., Depraetre, F., Le Falher, A., and Oudeyer, P.-Y. (2014). Poppy