ACKNOWLEDGEMENTS
This work is supported by the RAXENV project,
funded by the French National Research Agency
(ANR).
REFERENCES
Ababsa, F. (2009). Advanced 3D localization by fusing
measurements from GPS, inertial and vision sensors.
In Proceedings of the 2009 IEEE International Conference
on Systems, Man and Cybernetics (SMC 2009), pages 871–875.
Ababsa, F. and Mallem, M. (2007). Hybrid 3D camera
pose estimation using particle filter sensor fusion.
Advanced Robotics, the International Journal of the
Robotics Society of Japan (RSJ), pages 165–181.
Aron, M., Simon, G., and Berger, M. (2007). Use of inertial
sensors to support video tracking.
Comput. Animat. Virtual Worlds, 18(1):57–68.
Azuma, R. (1993). Tracking requirements for augmented
reality. Commun. ACM, 36(7):50–51.
Bay, H., Ess, A., Tuytelaars, T., and Van Gool, L. (2008). SURF:
Speeded up robust features. Computer Vision and Im-
age Understanding (CVIU), 110(3):346–359.
Bleser, G. (2009). Towards Visual-Inertial SLAM for Mo-
bile Augmented Reality. PhD thesis, Technical Uni-
versity Kaiserslautern.
Bleser, G. and Stricker, D. (2008). Using the marginalised
particle filter for real-time visual-inertial sensor fu-
sion. In Proceedings of the IEEE/ACM International
Symposium on Mixed and Augmented Reality (ISMAR), pages 3–12.
Borenstein, J. and Feng, L. (1996). Gyrodometry: A new
method for combining data from gyros and odometry
in mobile robots. In Proceedings of the 1996
IEEE International Conference on Robotics and Au-
tomation, pages 423–428.
Didier, J., Otmane, S., and Mallem, M. (2006). A compo-
nent model for augmented/mixed reality applications
with reconfigurable data-flow. In 8th International
Conference on Virtual Reality (VRIC 2006), pages
243–252, Laval (France).
Faugeras, O. and Toscani, G. (1987). Camera calibration
for 3D computer vision. In Proc. Int'l Workshop on In-
dustrial Applications of Machine Vision and Machine
Intelligence, pages 240–247.
Fischler, M. A. and Bolles, R. C. (1981). Random sample
consensus: a paradigm for model fitting with appli-
cations to image analysis and automated cartography.
Commun. ACM, 24(6):381–395.
Harris, C. (1993). Tracking with rigid models. Active vi-
sion, pages 59–73.
Hol, J., Schön, T., Gustafsson, F., and Slycke, P. (2006).
Sensor fusion for augmented reality. In Proceedings of
the 9th International Conference on Information Fusion, pages
1–6, Florence, Italy. IEEE.
Lu, C.-P., Hager, G. D., and Mjolsness, E. (2000). Fast
and globally convergent pose estimation from video
images. IEEE Trans. Pattern Anal. Mach. Intell.,
22(6):610–622.
Lucas, B. and Kanade, T. (1981). An iterative image regis-
tration technique with an application to stereo vision.
In IJCAI81, pages 674–679.
Maidi, M., Ababsa, F., and Mallem, M. (2005). Vision-
inertial system calibration for tracking in augmented
reality. In 2nd International Conference on Informat-
ics in Control, Automation and Robotics, pages 156–
162.
Reitmayr, G. and Drummond, T. (2006). Going out: Robust
model-based tracking for outdoor augmented reality.
In IEEE ISMAR, Santa Barbara, California, USA.
Reitmayr, G. and Drummond, T. (2007). Initialisation for
visual tracking in urban environments. In IEEE IS-
MAR, Nara, Japan.
Ribo, M., Lang, P., Ganster, H., Brandner, M., Stock, C.,
and Pinz, A. (2002). Hybrid tracking for outdoor aug-
mented reality applications. IEEE Comput. Graph.
Appl., 22(6):54–63.
Schall, G., Wagner, D., Reitmayr, G., Taichmann, E.,
Wieser, M., Schmalstieg, D., and Hofmann-Wellenhof, B.
(2009). Global pose estimation using multi-sensor fu-
sion for outdoor augmented reality. In Proceedings
of the IEEE International Symposium on Mixed and Augmented Re-
ality 2009, Orlando, Florida, USA.
Viéville, T., Romann, F., Hotz, B., Mathieu, H., Buffa, M.,
Robert, L., Facao, P., Faugeras, O., and Audren, J.
(1993). Autonomous navigation of a mobile robot
using inertial and visual cues. In Proceedings of the
IEEE International Conference on Intelligent Robots
and Systems.
Williams, C. (1997). Prediction with Gaussian processes:
From linear regression to linear prediction and be-
yond. Technical report, Neural Computing Research
Group.
You, S., Neumann, U., and Azuma, R. (1999). Orien-
tation tracking for outdoor augmented reality regis-
tration. IEEE Computer Graphics and Applications,
19(6):36–42.
Zendjebil, I., Ababsa, F., Didier, J.-Y., and Mallem, M.
(2010). A GPS-IMU-camera modelization and calibra-
tion for 3D localization dedicated to outdoor mobile
applications. In International Conference on Control,
Automation and Systems.
Zendjebil, I. M., Ababsa, F., Didier, J.-Y., and Mallem, M.
(2008). On the hybrid aid-localization for outdoor
augmented reality applications. In VRST '08: Pro-
ceedings of the 2008 ACM Symposium on Virtual Re-
ality Software and Technology, pages 249–250, New
York, NY, USA. ACM.