GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality

Sebastian P. Kleinschmidt, Bernardo Wagner

Abstract

In this paper, a new virtual reality (VR) control concept for operating robots in search and rescue (SAR) scenarios is introduced. The presented approach intuitively provides different sensor signals such as RGB, thermal and active infrared images by projecting them onto 3D structures generated by a Time of Flight (ToF)-based depth camera. The multichannel 3D data are displayed using an Oculus Rift head-mounted display, which additionally provides head tracking information. The usage of 3D structures can improve the perception of scale and depth by providing stereoscopic images, which cannot be generated for stand-alone 2D images. Besides the described operating concept, the main contributions of this paper are the introduction of a hybrid calibration pattern for multi-sensor calibration and a high-performance 2D-to-3D mapping procedure. To ensure low latencies, all steps of the algorithm are performed in parallel on a graphics processing unit (GPU), which reduces the traditional processing time on a central processing unit (CPU) by 80.03%. Furthermore, different input images are merged according to their importance for the operator to create a multi-sensor point cloud.
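The core of the described 2D-to-3D mapping is a standard pinhole projection: depth pixels are lifted into 3D using the depth camera's intrinsics, transformed into the frame of a second sensor (RGB, thermal or active infrared), and colored by sampling that sensor's image. The sketch below illustrates this on the CPU with NumPy; it is not the authors' GPU implementation, and all function names, the intrinsic matrices `K` and `K_rgb`, and the extrinsics `R`, `t` are illustrative assumptions.

```python
import numpy as np

def backproject_depth(depth, K):
    """Lift a depth image (meters) to 3D points in the depth camera frame.

    K is the 3x3 intrinsic matrix of the depth camera (assumed, pinhole model).
    """
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def color_points(points, K_rgb, R, t, image):
    """Project 3D points into a second camera (RGB/thermal/IR) and sample it.

    R, t map depth-camera coordinates into the second camera's frame
    (obtained from extrinsic calibration). Returns per-point samples and a
    mask of points that land inside the second image.
    """
    p_cam = points @ R.T + t               # transform into the color camera frame
    in_front = p_cam[:, 2] > 0             # keep only points in front of the camera
    uv = p_cam @ K_rgb.T
    uv = uv[:, :2] / uv[:, 2:3]            # perspective division
    u = np.round(uv[:, 0]).astype(int)     # nearest-neighbor sampling
    v = np.round(uv[:, 1]).astype(int)
    h, w = image.shape[:2]
    inside = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    samples = np.zeros((points.shape[0],) + image.shape[2:], dtype=image.dtype)
    samples[inside] = image[v[inside], u[inside]]
    return samples, inside
```

Because every pixel is processed independently, this loop-free formulation maps directly onto a GPU kernel with one thread per depth pixel, which is what makes the parallelization described in the abstract effective.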

References

  1. Brown, D. C. (1971). Close-range camera calibration. In Photogrammetric Engineering, volume 37, pages 855-866.
  2. Cui, J., Tosunoglu, S., Roberts, R., Moore, C., and Repperger, W. (2003). A review of teleoperation system control. In Florida Conference on Recent Advances in Robotics (FCRAR), pages 1-12, Boca Raton, FL, USA.
  3. Endres, F., Hess, J., Sturm, J., Cremers, D., and Burgard, W. (2013). 3D mapping with an RGB-D camera. In IEEE Transactions on Robotics, volume 30, pages 177-187.
  4. Gernert, B., Schildt, S., Wolf, L., Zeise, B., Fritsche, P., Wagner, B., Fiosins, M., Manesh, R., and Mueller, J. (2014). An interdisciplinary approach to autonomous team-based exploration in disaster scenarios. In IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pages 1-8, Hokkaido, Japan.
  5. Hainsworth, D. W. (2001). Teleoperation user interfaces for mining robotics. In Autonomous Robots, volume 11, pages 19-28, Hingham, MA, USA.
  6. Henry, P., Krainin, M., Herbst, E., Ren, X., and Fox, D. (2010). RGB-D mapping: Using depth cameras for dense 3d modeling of indoor environments. In RGBD: Advanced Reasoning with Depth Cameras Workshop in conjunction with RSS, Zaragoza, Spain.
  7. Izadi, S., Kim, D., Hilliges, O., Molyneaux, D., Newcombe, R., Kohli, P., Shotton, J., Hodges, S., Freeman, D., Davison, A., and Fitzgibbon, A. (2011). KinectFusion: Real-time 3D reconstruction and interaction using a moving depth camera. In ACM Symposium on User Interface Software and Technology, pages 559-568, Santa Barbara, CA, USA.
  8. LaValle, S. M., Yershova, A., Katsev, M., and Antonov, M. (2014). Head tracking for the Oculus Rift. In IEEE International Conference on Robotics and Automation (ICRA), pages 187-194, Hong Kong, China.
  9. Moghadam, P. and Vidas, S. (2014). Heatwave: The next generation of thermography devices. In International Society for Optical Engineering (SPIE), volume 9105, page 91050.
  10. Nguyen, L., Bualat, M., Edwards, L., Flueckiger, L., Neveu, C., Schwehr, K., Wagner, M., and Zbinden, E. (2001). Virtual reality interfaces for visualization and control of remote vehicles. Autonomous Robots, 11(1):59-68.
  11. Okura, F., Ueda, Y., Sato, T., and Yokoya, N. (2013). Teleoperation of mobile robots by generating augmented free-viewpoint images. In IEEE and RSJ International Conference on Intelligent Robots and Systems (IROS), pages 665-671, Tokyo, Japan.
  12. Ridao, P., Carreras, M., Hernandez, E., and Palomeras, N. (2007). Underwater telerobotics for collaborative research. In Ferre, M., Buss, M., Aracil, R., Melchiorri, C., and Balaguer, C., editors, Advances in Telerobotics, volume 31 of Springer Tracts in Advanced Robotics, pages 347-359. Springer Berlin Heidelberg.
  13. Saitoh, K., Machida, T., Kiyokawa, K., and Takemura, H. (2006). A 2D-3D integrated interface for mobile robot control using omnidirectional images and 3d geometric models. In IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), pages 173-176, Washington, DC, USA.
  14. Stoll, E., Wilde, M., and Pong, C. (2009). Using virtual reality for human-assisted in-space robotic assembly. In World Congress on Engineering and Computer Science, volume 2, San Francisco, USA.
  15. Tomasi, C. and Manduchi, R. (1998). Bilateral filtering for gray and color images. In IEEE International Conference on Computer Vision (ICCV), pages 839-846, Washington, DC, USA.
  16. Vidas, S. and Moghadam, P. (2013). Heatwave: A handheld 3D thermography system for energy auditing. In Energy and Buildings, volume 66, pages 445-460.
  17. Vidas, S., Moghadam, P., and Bosse, M. (2013). 3D thermal mapping of building interiors using an RGB-D and thermal camera. In IEEE International Conference on Robotics and Automation (ICRA), pages 2311-2318, Karlsruhe, Germany.
  18. Yong, L. S., Yang, W. H., and Jr, M. A. (1998). Robot task execution with telepresence using virtual reality technology. In International Conference on Mechatronic Technology, Hsinchu, Taiwan.
  19. Zeise, B., Kleinschmidt, S. P., and Wagner, B. (2015). Improving the interpretation of thermal images with the aid of emissivity's angular dependency. In IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pages 1-8, West Lafayette, Indiana, USA.
  20. Zhang, Z. (1999). Flexible camera calibration by viewing a plane from unknown orientations. In IEEE International Conference on Computer Vision, volume 1, pages 666-673, Kerkyra, Greece.
  21. Zhang, Z. (2000). A flexible new technique for camera calibration. In IEEE Transactions on Pattern Analysis and Machine Intelligence, volume 22, pages 1330-1334.


Paper Citation


in Harvard Style

Kleinschmidt S. and Wagner B. (2016). GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. In Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 978-989-758-198-4, pages 19-29. DOI: 10.5220/0005692200190029


in Bibtex Style

@conference{icinco16,
author={Sebastian P. Kleinschmidt and Bernardo Wagner},
title={GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality},
booktitle={Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2016},
pages={19-29},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005692200190029},
isbn={978-989-758-198-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality
SN - 978-989-758-198-4
AU - Kleinschmidt S.
AU - Wagner B.
PY - 2016
SP - 19
EP - 29
DO - 10.5220/0005692200190029