LARGE SCALE LOCALIZATION - For Mobile Outdoor Augmented Reality Applications

I. M. Zendjebil, F. Ababsa, J-Y. Didier, M. Mallem

Abstract

In this paper, we present an original localization system for large-scale outdoor environments that uses a markerless vision-based approach to estimate the camera pose. It relies on natural feature points extracted from images. Since this type of method is sensitive to brightness changes, occlusions and sudden motions, all of which are likely to occur in outdoor environments, we use two additional sensors to assist the vision process. Our aim is to demonstrate the feasibility of such an assistance scheme in a large-scale outdoor environment. The intent is to provide a fallback for the vision system in case of failure, as well as to reinitialize it when needed. The complete localization system is designed to be autonomous and adaptable to different situations. We present an overview of our system and its performance, along with results obtained from experiments performed in an outdoor environment under real conditions.

References

  1. Ababsa, F. (2009). Advanced 3d localization by fusing measurements from gps, inertial and vision sensors. In Systems, Man and Cybernetics, 2009. SMC 2009. IEEE International Conference on, pages 871-875.
  2. Ababsa, F. and Mallem, M. (2007). Hybrid 3d camera pose estimation using particle filter sensor fusion. Advanced Robotics, the International Journal of the Robotics Society of Japan (RSJ), pages 165-181.
  3. Aron, M., Simon, G., and Berger, M. (2007). Use of inertial sensors to support video tracking: Research articles. Comput. Animat. Virtual Worlds, 18(1):57-68.
  4. Azuma, R. (1993). Tracking requirements for augmented reality. Commun. ACM, 36(7):50-51.
  5. Bay, H., Ess, A., Tuytelaars, T., and Gool, L. V. (2008). Surf: Speeded up robust features. Computer Vision and Image Understanding (CVIU), 110(3):346-359.
  6. Bleser, G. (2009). Towards Visual-Inertial SLAM for Mobile Augmented Reality. PhD thesis, Technical University Kaiserslautern.
  7. Bleser, G. and Stricker, D. (2008). Using the marginalised particle filter for real-time visual-inertial sensor fusion. In Mixed and Augmented Reality, IEEE/ACM International Symposium on, pages 3-12.
  8. Borenstein, J. and Feng, L. (1996). Gyrodometry: A new method for combining data from gyros and odometry in mobile robots. In Proceedings of the 1996 IEEE International Conference on Robotics and Automation, pages 423-428.
  9. Didier, J., Otmane, S., and Mallem, M. (2006). A component model for augmented/mixed reality applications with reconfigurable data-flow. In 8th International Conference on Virtual Reality (VRIC 2006), pages 243-252, Laval (France).
  10. Faugeras, O. and Toscani, G. (1987). Camera calibration for 3d computer vision. In Proc. Int'l Workshop Industrial Applications of Machine Vision and Machine Intelligence, pages 240-247.
  11. Fischler, M. A. and Bolles, R. C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM, 24(6):381-395.
  12. Harris, C. (1993). Tracking with rigid models. Active vision, pages 59-73.
  13. Hol, J., Schon, T., Gustafsson, F., and Slycke, P. (2006). Sensor fusion for augmented reality. In Information Fusion, 2006 9th International Conference on, pages 1-6, Florence. IEEE.
  14. Lu, C.-P., Hager, G. D., and Mjolsness, E. (2000). Fast and globally convergent pose estimation from video images. IEEE Trans. Pattern Anal. Mach. Intell., 22(6):610-622.
  15. Lucas, B. and Kanade, T. (1981). An iterative image registration technique with an application to stereo vision. In IJCAI81, pages 674-679.
  16. Maidi, M., Ababsa, F., and Mallem, M. (2005). Vision-inertial system calibration for tracking in augmented reality. In 2nd International Conference on Informatics in Control, Automation and Robotics, pages 156-162.
  17. Reitmayr, G. and Drummond, T. (2006). Going out: Robust model-based tracking for outdoor augmented reality. In IEEE ISMAR, Santa Barbara, California, USA.
  18. Reitmayr, G. and Drummond, T. (2007). Initialisation for visual tracking in urban environments. In IEEE ISMAR, Nara, Japan.
  19. Ribo, M., Lang, P., Ganster, H., Brandner, M., Stock, C., and Pinz, A. (2002). Hybrid tracking for outdoor augmented reality applications. IEEE Comput. Graph. Appl., 22(6):54-63.
  20. Schall, G., Wagner, D., Reitmayr, G., Taichmann, E., Wieser, M., Schmalstieg, D., and Hofmann-Wellenhof, B. (2009). Global pose estimation using multi-sensor fusion for outdoor augmented reality. In Proceedings of the IEEE Int. Symposium on Mixed and Augmented Reality 2009, Orlando, Florida, USA.
  21. Viéville, T., Romann, F., Hotz, B., Mathieu, H., Buffa, M., Robert, L., Facao, P., Faugeras, O., and Audren, J. (1993). Autonomous navigation of a mobile robot using inertial and visual cues. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems.
  22. Williams, C. (1997). Prediction with gaussian processes: From linear regression to linear prediction and beyond. Technical report, Neural Computing Research Group.
  23. You, S., Neumann, U., and Azuma, R. (1999). Orientation tracking for outdoor augmented reality registration. IEEE Computer Graphics and Applications, 19(6):36-42.
  24. Zendjebil, I., Ababsa, F., Didier, J.-Y., and Mallem, M. (2010). A gps-imu-camera modelization and calibration for 3d localization dedicated to outdoor mobile applications. In International Conference on Control, Automation and Systems.
  25. Zendjebil, I. M., Ababsa, F., Didier, J.-Y., and Mallem, M. (2008). On the hybrid aid-localization for outdoor augmented reality applications. In VRST '08: Proceedings of the 2008 ACM Symposium on Virtual Reality Software and Technology, pages 249-250, New York, NY, USA. ACM.
Paper Citation


in Harvard Style

Zendjebil, I. M., Ababsa, F., Didier, J.-Y. and Mallem, M. (2011). LARGE SCALE LOCALIZATION - For Mobile Outdoor Augmented Reality Applications. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2011) ISBN 978-989-8425-47-8, pages 492-501. DOI: 10.5220/0003364404920501


in Bibtex Style

@conference{visapp11,
author={I. M. Zendjebil and F. Ababsa and J-Y. Didier and M. Mallem},
title={LARGE SCALE LOCALIZATION - For Mobile Outdoor Augmented Reality Applications},
booktitle={Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2011)},
year={2011},
pages={492-501},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003364404920501},
isbn={978-989-8425-47-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2011)
TI - LARGE SCALE LOCALIZATION - For Mobile Outdoor Augmented Reality Applications
SN - 978-989-8425-47-8
AU - Zendjebil, I. M.
AU - Ababsa, F.
AU - Didier, J.-Y.
AU - Mallem, M.
PY - 2011
SP - 492
EP - 501
DO - 10.5220/0003364404920501