Authors:
O. Ait Aider, G. Blanc, Y. Mezouar and P. Martinet
Affiliation:
LASMEA, UBP Clermont II, CNRS - UMR6602, France
Keyword(s):
Mobile robot, autonomous navigation, pattern tracking, visual servoing.
Related Ontology Subjects/Areas/Topics:
Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robotics and Automation
Abstract:
The paper describes a complete framework for autonomous environment mapping, localization and navigation using exclusively monocular vision. The environment map is a mosaic of 2D patterns detected on the ceiling plane and used as natural landmarks. The robot is able to localize itself and to reproduce learned trajectories defined by a set of key images representing the visual memory. A specific multiple 2D pattern tracker was developed for the application. It is based on particle filtering and uses both image contours and gray-level variations to track 2D patterns efficiently, even against cluttered ceiling appearance. When running autonomously, the robot is controlled by a visual servoing law adapted to its nonholonomic constraint. Based on the regulation of successive homographies, this control law guides the robot along the reference visual route without explicitly planning any trajectory. Results from real experiments illustrate the validity of the presented framework.
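To make the particle-filtering idea behind the pattern tracker concrete, the sketch below shows a generic predict/weight/resample loop for tracking one 2D pattern in image coordinates. It is illustrative only and not the authors' implementation: the state parameterization (x, y, theta), the motion-noise values, and the two placeholder scoring functions standing in for the contour and gray-level cues are assumptions.

```python
# Minimal particle-filter sketch for tracking one 2D ceiling pattern in image
# coordinates. Illustrative assumptions: state = (x, y, theta), random-walk
# motion model, and placeholder contour/gray-level scoring functions.
import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 200
MOTION_NOISE = np.array([3.0, 3.0, 0.05])   # pixels, pixels, radians

def predict(particles):
    """Diffuse particles with a random-walk motion model."""
    return particles + rng.normal(0.0, MOTION_NOISE, particles.shape)

def likelihood(particle, contour_score, gray_score):
    """Combine a contour-matching score and a gray-level score (both in [0, 1])."""
    return contour_score(particle) * gray_score(particle)

def resample(particles, weights):
    """Resample particles proportionally to their weights."""
    weights = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def track_step(particles, contour_score, gray_score):
    """One predict / weight / resample cycle; returns particles and pose estimate."""
    particles = predict(particles)
    weights = np.array([likelihood(p, contour_score, gray_score) for p in particles])
    weights += 1e-12                         # guard against degenerate zero weights
    estimate = np.average(particles, axis=0, weights=weights)
    return resample(particles, weights), estimate

# Toy usage with placeholder scores (a real tracker would evaluate the pattern
# template against the current image at each particle's hypothesized pose).
true_pose = np.array([320.0, 240.0, 0.1])
particles = true_pose + rng.normal(0.0, 10.0, (N_PARTICLES, 3))
contour_score = lambda p: np.exp(-np.linalg.norm(p[:2] - true_pose[:2]) / 10.0)
gray_score = lambda p: np.exp(-abs(p[2] - true_pose[2]) / 0.1)
for _ in range(20):
    particles, estimate = track_step(particles, contour_score, gray_score)
print("estimated pose:", estimate)
```

In the framework described by the abstract, the per-particle scores would come from matching the learned 2D pattern's contours and gray-level appearance against the current ceiling image, rather than from the synthetic functions used here.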