Authors: Jessica Combier 1; Bertrand Vandeportaele 2 and Patrick Danès 2
Affiliations: 1 ESSILOR International, Université de Toulouse, CNRS and UPS, France; 2 Université de Toulouse, CNRS and UPS, France
Keyword(s):
Augmented Reality Head Mounted Device, Image Based Rendering, Fish-eye Stereo-vision System, SLAM, Gaze Interaction.
Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Image Formation and Preprocessing; Image Formation, Acquisition Devices and Sensors; Image Generation Pipeline: Algorithms and Techniques
Abstract:
An Augmented Reality prototype is presented. Its hardware architecture is composed of a Head Mounted Display, a wide Field of View (FOV) passive stereo-vision system, a gaze tracker and a laptop. An associated software architecture is proposed to immerse the user in augmented environments where he/she can move freely. The system maps the unknown real-world (indoor or outdoor) environment and localizes itself within this map by means of state-of-the-art binocular Simultaneous Localization and Mapping techniques. It overcomes the FOV limitations of conventional augmented reality devices by using wide-angle cameras and associated algorithms. It also solves the parallax issue induced by the distinct locations of the two cameras and of the user's eyes by using Depth Image Based Rendering. An embedded gaze tracker, together with environment modeling techniques, enables gaze-controlled interaction. A simple application is presented, in which a virtual object is inserted into the user's FOV and follows his/her gaze. While the targeted real-time performance has not yet been achieved, the paper discusses ways to improve both frame rate and latency. Other future works are also overviewed.
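To illustrate the Depth Image Based Rendering step mentioned in the abstract, the sketch below forward-warps a camera depth map into a virtual eye viewpoint, which is the basic operation used to compensate for the parallax between the cameras and the user's eyes. It is a minimal illustration under simplifying assumptions: a pinhole projection model is used instead of the prototype's fish-eye optics, and the names dibr_warp, K_cam, K_eye and T_eye_cam are hypothetical; this is not the authors' implementation.

```python
import numpy as np

def dibr_warp(depth, K_cam, K_eye, T_eye_cam, out_shape):
    """Forward-warp a depth map from the camera frame to a virtual eye view.

    depth     : HxW depth map in meters, expressed in the camera frame.
    K_cam     : 3x3 intrinsics of the physical camera (pinhole assumption).
    K_eye     : 3x3 intrinsics of the virtual eye view.
    T_eye_cam : 4x4 rigid transform taking camera-frame points to the eye frame.
    out_shape : (Ho, Wo) size of the rendered eye-view depth map.
    """
    H, W = depth.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    valid = depth > 0
    u, v, z = us[valid], vs[valid], depth[valid]

    # Back-project valid pixels to 3D points in the camera frame.
    rays = np.linalg.inv(K_cam) @ np.stack([u, v, np.ones_like(u)]).astype(float)
    pts_cam = rays * z

    # Transfer the 3D points into the eye frame.
    pts_eye = T_eye_cam[:3, :3] @ pts_cam + T_eye_cam[:3, 3:4]

    # Re-project into the virtual eye image.
    proj = K_eye @ pts_eye
    ue = (proj[0] / proj[2]).round().astype(int)
    ve = (proj[1] / proj[2]).round().astype(int)
    ze = proj[2]

    Ho, Wo = out_shape
    out = np.full(out_shape, np.inf)
    inside = (ue >= 0) & (ue < Wo) & (ve >= 0) & (ve < Ho) & (ze > 0)
    # Simple z-buffer: keep the closest point that lands on each target pixel.
    np.minimum.at(out, (ve[inside], ue[inside]), ze[inside])
    out[np.isinf(out)] = 0.0
    return out
```

In practice the same warp is applied to the color image (and holes left by disocclusions are filled), and the fish-eye distortion model of the stereo rig replaces the pinhole projection used here for brevity.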