Authors: Burtin Gabriel 1; Bonnin Patrick 2 and Malartre Florent 1
Affiliations: 1 4D-Virtualiz, 10 Allee Evariste Galois, Clermont-Ferrand, France; 2 LISV, Universite de Versailles St Quentin, 10-12 Avenue de l'Europe, Velizy, France
Keyword(s):
Line Segment Detection, Algorithm Optimization, Camera-lidar Sensor Fusion, Localization, Extended Kalman Filter.
Related Ontology Subjects/Areas/Topics: Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robot Design, Development and Control; Robotics and Automation; Sensors Fusion; Signal Processing, Sensors, Systems Modeling and Control; Virtual Environment, Virtual and Augmented Reality
Abstract:
The objective of this work is to use various sensors efficiently to build a SLAM system. The algorithm has to be fast (real-time), computationally light and accurate enough for the robot to navigate its environment. Because other embedded processes require a large amount of CPU time, our goal was to exploit complementary sensors to obtain fairly accurate localization with minimal computation. To that end, we combined two sensors, a 2D lidar and a camera, mounted one above the other on the robot and facing the same direction, in order to pinpoint and cross-match features in their shared field of view. Our optimized algorithms are based on segment detection: we observe vertical lines with the camera and locate them in 3D using the ranges provided by the 2D lidar. We first implemented an RGB vertical line detector using an RGB gradient and a linking process, then an accelerated lidar data segmentation, and finally used this feature detector in a Kalman filter. The final code is evaluated and validated in an advanced real-time robotic simulator and later confirmed with a real experiment.
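As a rough illustration of the camera-side step mentioned in the abstract, the sketch below detects vertical line segments by thresholding a horizontal per-channel RGB gradient and linking edge pixels into runs column by column. The function and parameter names (`detect_vertical_lines`, `grad_thresh`, `min_run`) are illustrative choices, not the authors' implementation.

```python
import numpy as np

def detect_vertical_lines(img, grad_thresh=30, min_run=5):
    """Detect vertical line segments in an RGB image (H, W, 3).

    Returns a list of (column, row_start, row_end) tuples in image
    coordinates. Illustrative sketch, not the paper's exact algorithm.
    """
    img = img.astype(np.int32)
    # Horizontal gradient per channel: central difference along the
    # column axis, keeping the strongest channel response per pixel.
    gx = np.abs(img[:, 2:, :] - img[:, :-2, :]).max(axis=2)  # shape (H, W-2)
    edges = gx > grad_thresh
    segments = []
    for col in range(edges.shape[1]):
        run_start = None
        # Scan one row past the end so a run touching the bottom is closed.
        for row in range(edges.shape[0] + 1):
            on = row < edges.shape[0] and edges[row, col]
            if on and run_start is None:
                run_start = row
            elif not on and run_start is not None:
                if row - run_start >= min_run:
                    # +1 maps gradient columns back to image columns.
                    segments.append((col + 1, run_start, row - 1))
                run_start = None
    return segments
```

In the fused system, the image column of each detected segment would be matched against a lidar range return in the same bearing to place the feature in 3D.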