Authors:
Rodrigo Lucas Santos; Mateus Silva and Ricardo Oliveira
Affiliation:
Departamento de Computação - DECOM, Universidade Federal de Ouro Preto - UFOP, Ouro Preto, Brazil
Keyword(s):
SLAM, Mobile Autonomous Robot, Data Fusion, RPLIDAR, RGB Camera.
Abstract:
Mobile autonomous robots require accurate maps to navigate and make informed decisions in real time. The SLAM (Simultaneous Localization and Mapping) technique allows robots to build maps while they move; however, SLAM can be challenging in complex or dynamic environments. This study presents a mobile autonomous robot named Scramble, which performs SLAM based on the fusion of data from two sensors: an RPLIDAR A1M8 LiDAR and an RGB camera. The research question addressed is: how can data fusion improve the accuracy of mapping, trajectory planning, and obstacle detection for mobile autonomous robots? In this paper, we show that the fusion of visual and depth data significantly improves the accuracy of mapping, trajectory planning, and obstacle detection. This study contributes to the advancement of autonomous robot navigation by introducing a data-fusion-based approach to SLAM. Mobile autonomous robots are used in a variety of applications, including package delivery, cleaning, and inspection.
The development of more robust and accurate SLAM algorithms is essential for the use of these robots in challenging environments.
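The abstract describes fusing RPLIDAR depth returns with RGB imagery. As an illustration of one common ingredient of such a fusion, the minimal Python sketch below projects planar LiDAR scan points into a camera image so depth can be associated with pixels. The camera intrinsics, LiDAR-to-camera transform, image size, and helper names are assumptions made for illustration only; they are not the calibration or code used by Scramble.

# Illustrative sketch only: projecting 2D LiDAR points into an RGB image
# frame with a pinhole camera model. All numeric parameters below are
# assumed values, not the paper's calibration.
import numpy as np

# Assumed camera intrinsics (focal lengths and principal point, in pixels).
FX, FY, CX, CY = 525.0, 525.0, 320.0, 240.0
IMG_W, IMG_H = 640, 480

# Assumed rigid transform from the LiDAR frame (x forward, y left, z up)
# to the camera frame (x right, y down, z forward), with the camera
# mounted 5 cm above the LiDAR.
R_CAM_LIDAR = np.array([[0.0, -1.0,  0.0],
                        [0.0,  0.0, -1.0],
                        [1.0,  0.0,  0.0]])
T_CAM_LIDAR = np.array([0.0, -0.05, 0.0])

def lidar_scan_to_points(ranges, angle_min, angle_increment):
    """Convert planar LiDAR ranges to 3D points in the LiDAR frame (z = 0)."""
    ranges = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    valid = np.isfinite(ranges) & (ranges > 0.0)
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    z = np.zeros_like(x)
    return np.stack([x, y, z], axis=1)

def project_to_image(points_lidar):
    """Project LiDAR-frame points into pixel coordinates; keep points in view."""
    pts_cam = points_lidar @ R_CAM_LIDAR.T + T_CAM_LIDAR
    in_front = pts_cam[:, 2] > 0.1          # drop points behind the camera
    pts_cam = pts_cam[in_front]
    u = FX * pts_cam[:, 0] / pts_cam[:, 2] + CX
    v = FY * pts_cam[:, 1] / pts_cam[:, 2] + CY
    in_view = (u >= 0) & (u < IMG_W) & (v >= 0) & (v < IMG_H)
    return np.stack([u[in_view], v[in_view]], axis=1), pts_cam[in_view, 2]

if __name__ == "__main__":
    # Fake a 360-degree scan (one reading per degree) of a wall 2 m away.
    ranges = np.full(360, 2.0)
    pts = lidar_scan_to_points(ranges, angle_min=-np.pi,
                               angle_increment=np.deg2rad(1.0))
    pixels, depths = project_to_image(pts)
    print(f"{len(pixels)} of {len(pts)} scan points fall inside the camera view")

Once depth is attached to pixels in this way, visual features and range measurements can be combined in a SLAM front end; the exact fusion strategy used in the paper is not detailed in this abstract.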