Furthermore, by replacing the ROS framework with another communication mechanism, our approach may also be applied to other application areas in which a multi-agent system must communicate with its users through a common human-machine interface.
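To illustrate what such a replacement could look like, the following sketch (not taken from our implementation; all class and method names are illustrative assumptions) shows how the agents could be written against a small transport interface, so that a ROS-backed implementation can be exchanged for another messaging mechanism without changing the agents themselves:

    # Minimal sketch of a pluggable communication layer; the names
    # MessageTransport and InProcessTransport are assumptions made
    # for illustration, not part of the presented system.
    from abc import ABC, abstractmethod

    class MessageTransport(ABC):
        """Interface the agents use instead of calling ROS directly."""

        @abstractmethod
        def publish(self, topic, message):
            """Deliver a message to every subscriber of the topic."""

        @abstractmethod
        def subscribe(self, topic, callback):
            """Register a callback invoked for each message on the topic."""

    class InProcessTransport(MessageTransport):
        """Toy ROS replacement: a per-topic callback registry."""

        def __init__(self):
            self._subscribers = {}

        def subscribe(self, topic, callback):
            self._subscribers.setdefault(topic, []).append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers.get(topic, []):
                callback(message)

An agent that holds only a MessageTransport reference could then be reused unchanged in settings where ROS is not available.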
Other modalities could also be developed in addition to the speech interface. By extending the functionality of the Voice Agent, further input methods such as text input with an autosuggest function, gesture recognition, or simple menu-based interfaces could be implemented. Such additional modalities can strengthen the system's disambiguation capabilities and improve the efficiency of communication (Green 2009; Breuer 2012; Fardana 2013).
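As a sketch of how such a modality could plug in, a console-based text-input node might look as follows. It assumes, purely for illustration, that the Voice Agent consumes recognized utterances from a ROS topic named /voice_agent/recognized_text; the topic and node names are hypothetical, not taken from our implementation:

    # Hypothetical text-input modality: typed commands are injected on
    # the same channel the speech recognizer would use, so the Voice
    # Agent needs no changes. Topic and node names are assumptions.
    import rospy
    from std_msgs.msg import String

    def main():
        rospy.init_node('text_input_modality')
        pub = rospy.Publisher('/voice_agent/recognized_text', String,
                              queue_size=10)
        while not rospy.is_shutdown():
            line = input('> ').strip()  # read one typed command
            if line:
                pub.publish(String(data=line))

    if __name__ == '__main__':
        main()

Because the typed text enters the pipeline at the same point as recognized speech, the downstream language-understanding components would remain untouched.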
ACKNOWLEDGEMENTS
This work was supported in part by the ARTEMIS Joint Undertaking and in part by the Hungarian National Research, Development and Innovation Fund within the framework of the Reconfigurable ROS-based Resilient Reasoning Robotic Cooperating Systems (R5-COP) Project.
The prototype application was developed at our department by a group of researchers including István Engedy, Péter Eredics, and Péter Györke.
REFERENCES
Bastianelli E., G. Castellucci, D. Croce, R. Basili, D. Nardi, 2014. Effective and Robust Natural Language Understanding for Human-Robot Interaction, ECAI 2014, T. Schaub et al. (Eds.).
Breuer T., et al., 2012. Johnny: An autonomous service robot for domestic environments, Journal of Intelligent & Robotic Systems 66(1-2), pp. 245-272.
Burgard W., A.B. Cremers, D. Fox, D. Hähnel, G.
Lakemeyer, D. Schulz, W. Steiner, S. Thrun, 1999.
Experiences with an interactive museum tour-guide
robot, Artificial Intelligence 114 (1999) 3–55.
Fardana A.R., S. Jain, I. Jovancevic, Y. Suri, C. Morand and
N.M. Robertson, 2013. Controlling a Mobile Robot
with Natural Commands based on Voice and Gesture,
Workshop on Human Robot Interaction (HRI) for
Assistance and Industrial Robots, VisionLab, Heriot-
Watt University, 2013.
Ferland F., R. Chauvin, D. Létourneau, F. Michaud, 2014. Hello Robot, Can You Come Here? Using ROS4iOS to Provide Remote Perceptual Capabilities for Visual Location, Speech and Speaker Recognition, Proc. HRI '14, the 2014 ACM/IEEE Int. Conf. on Human-Robot Interaction, p. 101.
Fong T., I. Nourbakhsh, K. Dautenhahn, 2003. A survey of
socially interactive robots, Robotics and Autonomous
Systems 42 (2003) 143–166.
Green A., 2009. Designing and Evaluating Human-Robot Communication, PhD thesis, KTH, 2009.
H2020 – Robotics 2020 – Multi-Annual Roadmap For
Robotics in Europe - http://www.eu-robotics.net/
Howard T.M., I. Chung, O. Propp, M.R. Walter, and N. Roy,
2014. Efficient Natural Language Interfaces for
Assistive Robots, IROS 2014 Workshop on
Rehabilitation and Assistive Robotics, Sept 14-18,
2014, Chicago.
Huber A., B. Ludwig, 2002. Users Talk to their Model Trains: Interaction with a Speech-based Multi-Agent System, Proceedings of the First International Joint Conference on Autonomous Agents and Multi-Agent Systems, AAMAS 2002, July 15-19, Bologna, Italy.
Kemke C., 2007. “From Saying to Doing” – Natural
Language Interaction with Artificial Agents and
Robots, Ch 9, Human-Robot Interaction, (ed.) N.
Sarkar, Sept 2007, Itech Education and Publishing,
Vienna, Austria.
Khayrallah H., S. Trott, J. Feldman, 2015. Natural
Language For Human Robot Interaction, Proc. of the
Workshop on Human-Robot Teaming at the 10th
ACM/IEEE Int. Conf. on Human-Robot Interaction,
Portland, Oregon.
Lauria S., T. Kyriacou, G. Bugmann, J. Bos, E. Klein, 2002. Converting Natural Language Route Instructions into Robot Executable Procedures, Proc. 11th IEEE Int. Workshop on Robot and Human Interactive Communication, pp. 223-228.
MacMahon M., B. Stankiewicz, B. Kuipers, 2006. Walk the Talk: Connecting Language, Knowledge, and Action in Route Instructions, Proc. AAAI'06, the 21st Nat. Conf. on Artificial Intelligence, Vol. 2, pp. 1475-1482.
R5-COP 2016, http://r5cop.mit.bme.hu/
Rousseau V., F. Ferland, D. Létourneau, F. Michaud, 2013. Sorry to Interrupt, But May I Have Your Attention? Preliminary Design and Evaluation of Autonomous Engagement in HRI, J. of Human-Robot Interaction, Vol. 2, No. 3, 2013, pp. 41–61.
Schiffer S., N. Hoppe, and G. Lakemeyer, 2012. Natural
Language Interpretation for an Interactive Service
Robot in Domestic Domains, J. Filipe and A. Fred
(Eds.): ICAART 2012, CCIS 358, pp. 39–53, 2012.
Stenmark M., J. Malec, 2015. Connecting natural language to task demonstrations and low-level control of industrial robots, Workshop on Multimodal Semantics for Robotic Systems (MuSRobS), IEEE/RSJ Int. Conf. on Intelligent Robots and Systems 2015.
Tellex S., T. Kollar, S. Dickerson, M.R. Walter, A. Gopal Banerjee, S. Teller, N. Roy, 2011. Understanding Natural Language Commands for Robotic Navigation and Mobile Manipulation, Proc. of the Nat. Conf. on Artificial Intelligence (AAAI 2011).
Wyner A., et al., 2009. On controlled natural languages: Properties and prospects, International Workshop on Controlled Natural Language, Springer Berlin Heidelberg, 2009.