Authors: A. A. Mekonnen ¹; F. Lerasle ¹ and I. Zuriarrain ²
Affiliations: ¹ CNRS, LAAS, Université de Toulouse, UPS, INSA, INP, ISAE and LAAS, France; ² University of Mondragon and Goi Eskola Politeknikoa, Spain
Keyword(s):
Multi-person tracking, Multi-modal data fusion, MCMC particle filtering, Interactive robotics.
Related Ontology Subjects/Areas/Topics: Active and Robot Vision; Applications; Computer Vision, Visualization and Computer Graphics; Human-Computer Interaction; Methodologies and Methods; Motion and Tracking; Motion, Tracking and Stereo Vision; Pattern Recognition; Physiological Computing Systems; Tracking of People and Surveillance
Abstract:
This paper addresses multi-modal person detection and tracking from a mobile robot in a crowded, cluttered environment, using a 2D SICK laser range finder and a visual camera. A sequential approach is proposed in which the laser data is segmented to extract human-leg-like structures and generate person hypotheses, which are then refined by a state-of-the-art parts-based visual person detector for final detection. Based on this detection routine, a Markov Chain Monte Carlo (MCMC) particle filtering strategy is used to track multiple persons around the robot. The integration of the implemented multi-modal person detector and tracker on our robotic platform and the associated experiments are presented. The reported results show that the multi-modal approach outperforms its single-sensor counterparts in terms of detection, subsequent use, computation time, and precision. The work presented here will be used to define navigational control laws for passer-by avoidance during a service robot's person-following activity.