Authors:
David Valiente; Arturo Gil Aparicio; Francisco Amorós Espí and Oscar Reinoso
Affiliation:
Miguel Hernández University, Spain
Keyword(s):
Visual SLAM, Omnidirectional Images, SGD.
Related Ontology Subjects/Areas/Topics:
Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robotics and Automation
Abstract:
This work presents a solution to the problem of Simultaneous Localization and Mapping (SLAM) based on a Stochastic Gradient Descent (SGD) technique and omnidirectional images. In the field of mobile robotics applications, SGD has never been tested with visual information obtained from the environment. This paper introduces an SGD algorithm into a SLAM scheme that exploits the benefits of omnidirectional images provided by a single camera. Several improvements have been made to the vanilla SGD in order to adapt it to the case of omnidirectional observations. This new SGD approach reduces the harmful effects of non-linearities, which compromise the convergence of traditional filter estimates. In particular, we rely on an efficient map representation composed of a reduced set of image views. The first contribution is the adaptation of the basic SGD algorithm to work with omnidirectional observations, whose nature is angular and therefore lacks scale. Former SGD approaches process only one constraint independently at each iteration step. Instead, we propose a strategy that employs several constraints simultaneously as system inputs, with the purpose of improving convergence speed when estimating a SLAM solution. In this context, we present different sets of experiments carried out to validate our new approach based on SGD with omnidirectional observations. In addition, we compare our approach with a basic SGD in order to demonstrate the expected benefits in terms of efficiency.
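To make the idea concrete, the sketch below shows an SGD-style pose-graph relaxation driven by scale-free angular (bearing-only) constraints, where several constraints are processed per update rather than one at a time. It is not the authors' implementation: the 2-D pose parameterisation, the numeric gradients, the learning-rate schedule and all function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's algorithm): SGD over a pose
# graph with bearing-only constraints, applied in small batches per step.
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def bearing_residual(poses, i, j, z_bearing):
    """Residual of a scale-free bearing observation from pose i to pose j."""
    dx, dy = poses[j, 0] - poses[i, 0], poses[j, 1] - poses[i, 1]
    predicted = np.arctan2(dy, dx) - poses[i, 2]
    return wrap(z_bearing - predicted)

def sgd_step(poses, batch, lr):
    """One SGD update using a batch of (i, j, measured_bearing) constraints."""
    eps = 1e-6
    grad = np.zeros_like(poses)
    for (i, j, z) in batch:
        r = bearing_residual(poses, i, j, z)
        # Numeric gradient of 0.5 * r^2 w.r.t. the parameters that affect r.
        for (k, d) in ((i, 0), (i, 1), (i, 2), (j, 0), (j, 1)):
            poses[k, d] += eps
            r_plus = bearing_residual(poses, i, j, z)
            poses[k, d] -= 2 * eps
            r_minus = bearing_residual(poses, i, j, z)
            poses[k, d] += eps                      # restore the parameter
            grad[k, d] += r * (r_plus - r_minus) / (2 * eps)
    poses -= lr * grad
    return poses

# Toy usage: three poses, a few bearing constraints, mini-batches of two,
# and a decaying learning rate (all values chosen arbitrarily).
rng = np.random.default_rng(0)
poses = np.array([[0.0, 0.0, 0.0], [1.2, 0.1, 0.2], [2.1, 0.9, 0.5]])
constraints = [(0, 1, 0.05), (1, 2, 0.60), (0, 2, 0.40)]
for it in range(200):
    rng.shuffle(constraints)
    for b in range(0, len(constraints), 2):         # several constraints at once
        poses = sgd_step(poses, constraints[b:b + 2], lr=0.05 / (1 + it))
```

Batching constraints in this way is one plausible reading of the paper's claim that using several constraints simultaneously as system inputs speeds up convergence compared with the one-constraint-per-iteration baseline.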