Authors: Bruno Lima¹; Givanildo L. N. Júnior¹; Lucas Amaral¹; Thales Vieira²; Bruno Ferreira¹ and Tiago Vieira¹
Affiliations: ¹ Institute of Computing, Federal University of Alagoas, Maceió, Brazil; ² Institute of Mathematics, Federal University of Alagoas, Maceió, Brazil
Keyword(s):
Human-robot Interaction, Deep Learning, Convolutional Neural Networks, Skeleton Tracking.
Related Ontology Subjects/Areas/Topics: Active and Robot Vision; Applications; Applications and Services; Computer Vision, Visualization and Computer Graphics; Enterprise Information Systems; Human and Computer Interaction; Human-Computer Interaction; Motion, Tracking and Stereo Vision; Pattern Recognition; Robotics; Software Engineering
Abstract:
We present a natural human-robot interaction approach based on teleoperation through body gestures. More specifically, we propose an interface in which the user intuitively controls the position and status (open/closed) of a robotic arm gripper with their hand. We employ a 6-DOF (six degrees-of-freedom) industrial manipulator that mimics the user's movements in real time, positioning the end effector as if the user were looking into a mirror, which yields a natural and intuitive interface. The user's controlling hand is tracked using body skeletons acquired from a Microsoft Kinect sensor, while a Convolutional Neural Network recognizes from depth data whether the hand is open or closed. The network was trained on hand images collected from several individuals in different orientations, resulting in a robust classifier that performs well regardless of the user's location or orientation. No wearable devices, such as gloves or wristbands, are needed. We present results of experiments that reveal the high accuracy of the proposed approach in recognizing both the user's hand position and its status (open/closed), as well as experiments demonstrating its robustness and applicability to industrial tasks.
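The mirror-like mapping from the tracked hand to the end effector can be sketched as below. This is a minimal illustration only: the function name, frame convention (mirroring along the camera's left-right axis), and scale factor are assumptions for exposition, not the authors' calibration or controller.

```python
def mirror_hand_to_effector(hand_xyz, workspace_scale=1.0):
    """Map a hand position tracked in the Kinect camera frame to a
    target end-effector position, mirroring the left-right axis so the
    robot moves as if the user were facing a mirror.

    hand_xyz: (x, y, z) hand coordinates from the skeleton tracker,
              with x pointing to the user's right, y up, z toward the
              sensor (an assumed convention).
    workspace_scale: scales human arm reach to the robot's workspace.
    """
    x, y, z = hand_xyz
    # Negate x to produce the mirror effect; keep height and depth.
    return (-x * workspace_scale, y * workspace_scale, z * workspace_scale)


# Example: a hand 0.3 m to the user's right maps to a target 0.3 m
# to the robot's left (in the mirrored frame).
target = mirror_hand_to_effector((0.3, 1.1, 1.8))
```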