Authors: Caio Fischer Silva 1; Paulo V. K. Borges 2 and José E. C. Castanho 3
Affiliations:
1 Robotics and Autonomous Systems Group, CSIRO, Australia and School of Engineering, São Paulo State University - UNESP, Bauru, SP, Brazil
2 Robotics and Autonomous Systems Group, CSIRO, Australia
3 School of Engineering, São Paulo State University - UNESP, Bauru, SP, Brazil
Keyword(s): Environment-aware Sensor Fusion using Deep Learning.
Related Ontology Subjects/Areas/Topics: Informatics in Control, Automation and Robotics; Perception and Awareness; Robotics and Automation
Abstract: A reliable perception pipeline is crucial to the operation of a safe and efficient autonomous vehicle. Fusing information from multiple sensors has become common practice to increase robustness, given that different types of sensors have distinct sensing characteristics. Further, sensor performance can vary with the operating environment. Most systems rely on a rigid sensor fusion strategy that considers only the sensor inputs (e.g., signals and their corresponding covariances), without incorporating the influence of the environment, which often causes poor performance in mixed scenarios. In our approach, the sensor fusion strategy is adjusted according to a classification of the scene around the vehicle: a convolutional neural network classifies the environment, and this classification is used to select the best sensor configuration. We present experiments with a full-size autonomous vehicle operating in a heterogeneous environment. The results illustrate the applicability of the method, with improved odometry estimation compared to a rigid sensor fusion scheme.
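
The abstract gives no implementation details, so the following is purely an illustrative sketch of the idea it describes: a scene classifier's output selects a sensor configuration, which then shapes the fusion. All environment classes, sensor names, scale factors, and function names below are hypothetical, and the fusion is shown as simple inverse-covariance weighting rather than the authors' actual scheme.

```python
import numpy as np

# Hypothetical environment classes a scene-classification CNN might output.
ENV_CLASSES = ["open_field", "dense_vegetation", "urban_canyon"]

# Hypothetical per-environment fusion configurations: scale factors that
# inflate the measurement covariance of sensors expected to degrade there.
FUSION_CONFIGS = {
    "open_field":       {"gps": 1.0, "lidar": 4.0},  # open sky: trust GPS
    "dense_vegetation": {"gps": 8.0, "lidar": 1.0},  # canopy: trust lidar
    "urban_canyon":     {"gps": 4.0, "lidar": 1.5},  # multipath: degrade GPS
}

def select_config(class_probs):
    """Map the CNN's class probabilities to a fusion configuration."""
    env = ENV_CLASSES[int(np.argmax(class_probs))]
    return env, FUSION_CONFIGS[env]

def fuse_odometry(estimates, covariances, config):
    """Inverse-covariance (information-form) fusion of per-sensor 2-D
    position estimates, with environment-dependent covariance scaling."""
    info_sum = np.zeros((2, 2))
    weighted = np.zeros(2)
    for sensor, z in estimates.items():
        cov = covariances[sensor] * config[sensor]  # inflate distrusted sensor
        info = np.linalg.inv(cov)
        info_sum += info
        weighted += info @ z
    fused_cov = np.linalg.inv(info_sum)
    return fused_cov @ weighted, fused_cov

# Example: the CNN reports dense vegetation, so GPS is down-weighted.
env, cfg = select_config(np.array([0.1, 0.8, 0.1]))
estimates = {"gps": np.array([10.4, 5.3]), "lidar": np.array([10.0, 5.0])}
covariances = {"gps": np.eye(2) * 2.0, "lidar": np.eye(2) * 0.5}
pose, pose_cov = fuse_odometry(estimates, covariances, cfg)
print(env, pose)
```

In this toy version the environment label only rescales per-sensor covariances before a standard information-form average; the paper's point is that such a mapping, learned or hand-tuned per environment class, outperforms a single fixed configuration in mixed scenarios.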