REFERENCES
Agarwal, S., Mierle, K., and The Ceres Solver Team (2022). Ceres Solver.
Antonante, P., Tzoumas, V., Yang, H., and Carlone, L. (2021). Outlier-robust estimation: Hardness, minimally tuned algorithms, and applications. IEEE Transactions on Robotics, 38(1):281–301.
Barath, D., Noskova, J., Ivashechkin, M., and Matas, J. (2020). MAGSAC++, a fast, reliable and accurate robust estimator. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Basso, F., Menegatti, E., and Pretto, A. (2018). Robust intrinsic and extrinsic calibration of RGB-D cameras. IEEE Transactions on Robotics, 34(5):1315–1332.
Black, M. J. and Anandan, P. (1996). The Robust Estimation of Multiple Motions: Parametric and Piecewise-Smooth Flow Fields. Computer Vision and Image Understanding, 63(1):75–104.
Bloesch, M., Omari, S., Hutter, M., and Siegwart, R. (2015). Robust visual inertial odometry using a direct EKF-based approach. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 298–304.
Brunetto, N., Salti, S., Fioraio, N., Cavallari, T., and Stefano, L. (2015). Fusion of inertial and visual measurements for RGB-D SLAM on mobile devices. In Proceedings of the IEEE International Conference on Computer Vision Workshops, pages 1–9.
Burri, M., Nikolic, J., Gohl, P., Schneider, T., Rehder, J., Omari, S., Achtelik, M. W., and Siegwart, R. (2016). The EuRoC micro aerial vehicle datasets. The International Journal of Robotics Research, 35(10):1157–1163.
Campos, C., Elvira, R., Rodríguez, J. J. G., Montiel, J. M., and Tardós, J. D. (2021). ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM. IEEE Transactions on Robotics, 37(6):1874–1890.
Chai, W., Chen, C., and Edwan, E. (2015). Enhanced indoor navigation using fusion of IMU and RGB-D camera. In International Conference on Computer Information Systems and Industrial Applications, pages 547–549. Atlantis Press.
Chang, Z., Meng, Y., Liu, W., Zhu, H., and Wang, L. (2022). WiCapose: multi-modal fusion based transparent authentication in mobile environments. Journal of Information Security and Applications, 66:103130.
Chen, W., Shang, G., Ji, A., Zhou, C., Wang, X., Xu, C., Li, Z., and Hu, K. (2022). An Overview on Visual SLAM: From Tradition to Semantic. Remote Sensing, 14(13).
Chghaf, M., Rodriguez, S., and Ouardi, A. E. (2022). Camera, LiDAR and multi-modal SLAM systems for autonomous ground vehicles: a survey. Journal of Intelligent & Robotic Systems, 105(1):1–35.
Chow, J. C., Lichti, D. D., Hol, J. D., Bellusci, G., and Luinge, H. (2014). IMU and multiple RGB-D camera fusion for assisting indoor stop-and-go 3D terrestrial laser scanning. Robotics, 3(3):247–280.
Chu, C. and Yang, S. (2020). Keyframe-based RGB-D visual-inertial odometry and camera extrinsic calibration using Extended Kalman Filter. IEEE Sensors Journal, 20(11):6130–6138.
Cioffi, G., Cieslewski, T., and Scaramuzza, D. (2022). Continuous-time vs. discrete-time vision-based SLAM: A comparative study. IEEE Robotics and Automation Letters, 7(2):2399–2406.
Darwish, W., Li, W., Tang, S., and Chen, W. (2017a). Coarse to fine global RGB-D frames registration for precise indoor 3D model reconstruction. In 2017 International Conference on Localization and GNSS (ICL-GNSS), pages 1–5. IEEE.
Darwish, W., Tang, S., Li, W., and Chen, W. (2017b). A new calibration method for commercial RGB-D sensors. Sensors, 17(6):1204.
Das, A., Elfring, J., and Dubbelman, G. (2021). Real-time vehicle positioning and mapping using graph optimization. Sensors, 21(8):2815.
Forster, C., Carlone, L., Dellaert, F., and Scaramuzza, D. (2016). On-manifold preintegration for real-time visual–inertial odometry. IEEE Transactions on Robotics, 33(1):1–21.
Geneva, P., Eckenhoff, K., Lee, W., Yang, Y., and Huang, G. (2020). OpenVINS: A research platform for visual-inertial estimation. In 2020 IEEE International Conference on Robotics and Automation (ICRA), pages 4666–4672.
Guo, C. X. and Roumeliotis, S. I. (2013). IMU-RGBD camera 3D pose estimation and extrinsic calibration: Observability analysis and consistency improvement. In 2013 IEEE International Conference on Robotics and Automation, pages 2935–2942. IEEE.
Heyden, A. and Pollefeys, M. (2005). Multiple view geometry. Emerging Topics in Computer Vision, 90:180–189.
Huai, J., Zhuang, Y., Lin, Y., Jozkow, G., Yuan, Q., and Chen, D. (2022). Continuous-time spatiotemporal calibration of a rolling shutter camera-IMU system. IEEE Sensors Journal, 22(8):7920–7930.
Huber, P. J. (1992). Robust estimation of a location parameter. In Breakthroughs in Statistics, pages 492–518. Springer.
Hug, D., Bänninger, P., Alzugaray, I., and Chli, M. (2022). Continuous-time stereo-inertial odometry. IEEE Robotics and Automation Letters, pages 1–1.
Jung, K., Shin, S., and Myung, H. (2022). U-VIO: Tightly Coupled UWB Visual Inertial Odometry for Robust Localization. In International Conference on Robot Intelligence Technology and Applications, pages 272–283. Springer.
Laidlow, T., Bloesch, M., Li, W., and Leutenegger, S. (2017). Dense RGB-D-inertial SLAM with map deformations. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 6741–6748. IEEE.
Lee, J., Hanley, D., and Bretl, T. (2022). Extrinsic calibration of multiple inertial sensors from arbitrary trajectories. IEEE Robotics and Automation Letters.
Robust RGB-D-IMU Calibration Method Applied to GPS-Aided Pose Estimation