Authors: Adam Kalisz 1; Florian Particke 1; Dominik Penk 2; Markus Hiller 1 and Jörn Thielecke 1
Affiliations: 1 Department of Electrical, Electronic and Communication Engineering, Information Technology (LIKE), Friedrich-Alexander-Universität Erlangen-Nürnberg, Am Wolfsmantel 33, Erlangen, Germany; 2 Department of Computer Science, Computer Graphics Lab (LGDV), Friedrich-Alexander-Universität Erlangen-Nürnberg, Cauerstraße 11, Erlangen, Germany
Keyword(s):
Fusion, Global Positioning System, Visual Simultaneous Localization and Mapping, GPS, SLAM, Simulation, Blender.
Related Ontology Subjects/Areas/Topics: Applications; Computer Vision, Visualization and Computer Graphics; Motion, Tracking and Stereo Vision; Pattern Recognition; Robotics; Software Engineering; Tracking and Visual Navigation
Abstract:
In order to account for sensor deficiencies, a multi-sensor approach is usually employed in which various sensors complement each other. However, synchronizing highly accurate Global Positioning System (GPS) and video measurements requires specialized hardware that is not straightforward to set up. This paper proposes a full simulation environment, based on free and open-source software, for data generation and evaluation of Visual Simultaneous Localization and Mapping (Visual SLAM) and GPS. Specifically, image data is created by rendering a virtual environment, to which camera effects such as motion blur and rolling shutter can be added. Consequently, a ground-truth camera trajectory is available and can be distorted via additive Gaussian noise in order to study all parameters involved in fusion algorithms such as the Kalman filter. The proposed evaluation framework will be published as open source online at https://master.kalisz.co for free use by the research community.
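The abstract describes distorting the ground-truth camera trajectory with additive Gaussian noise to emulate noisy measurements for filter evaluation. A minimal sketch of such a distortion step (the function name, noise level, and example trajectory are illustrative assumptions, not taken from the paper) could look like:

```python
import numpy as np

def distort_trajectory(trajectory, sigma, seed=None):
    """Add zero-mean Gaussian noise with standard deviation `sigma`
    to each position of a ground-truth trajectory (N x 3 array of
    x, y, z coordinates)."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(loc=0.0, scale=sigma, size=trajectory.shape)
    return trajectory + noise

# Example: a straight-line ground-truth trajectory along x,
# perturbed with sigma = 0.05 (e.g. meters) to mimic GPS-like
# position noise before feeding it to a fusion filter.
ground_truth = np.stack([np.linspace(0.0, 10.0, 100),
                         np.zeros(100),
                         np.zeros(100)], axis=1)
noisy = distort_trajectory(ground_truth, sigma=0.05, seed=42)
```

Because the ground truth is known exactly, the error of any fusion result against `ground_truth` can be computed directly, which is the evaluation setting the abstract describes.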