Authors: Madjid Maidi; Fakhr-Eddine Ababsa and Malik Mallem
Affiliation: IBISC Laboratory / CNRS FRE 2873, University of Evry, France
Keyword(s): Computer vision, augmented reality, calibration, pose estimation, robust tracking.
Related Ontology Subjects/Areas/Topics: Informatics in Control, Automation and Robotics; Robotics and Automation; Virtual Environment, Virtual and Augmented Reality; Vision, Recognition and Reconstruction
Abstract:
In this paper, we present a robust fiducial tracking method for real-time Augmented Reality systems. Our approach identifies the target object from the internal barcode of the fiducial and extracts its 2D feature points. Given the 2D feature points and a 3D object model, pose estimation consists of recovering the position and orientation of the object with respect to the camera. We present two methods for recovering pose: the Extended Kalman Filter and the Orthogonal Iteration algorithm. The first is a sequential estimator that predicts and corrects the state vector, while the latter minimizes an object-space collinearity error and derives an iterative algorithm that computes orthogonal rotation matrices. Tracking may fail because of lighting or contrast conditions, or when the target object is occluded by another object. We therefore extend our tracking method with a RANSAC algorithm to deal with occlusions. The algorithm is tested from different camera viewpoints under various image conditions and proves to be accurate and robust.
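The following is a minimal illustrative sketch of the 2D-3D pose-recovery step with RANSAC outlier rejection, not the authors' EKF/OI implementation: it uses OpenCV's solvePnPRansac, and the fiducial geometry, camera intrinsics, and simulated occluded corner are assumed example values.

import numpy as np
import cv2

# 3D model: corners of two hypothetical 80 mm square fiducials attached to the target object.
s = 0.08
object_points = np.array([
    [-s/2,  s/2, 0], [ s/2,  s/2, 0], [ s/2, -s/2, 0], [-s/2, -s/2, 0],   # fiducial 1
    [-s/2,  s/2, s], [ s/2,  s/2, s], [ s/2, -s/2, s], [-s/2, -s/2, s],   # fiducial 2 (offset in Z)
], dtype=np.float64)

# Assumed camera intrinsics from an offline calibration step.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# Ground-truth pose used only to synthesise 2D measurements for this demo.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.02, -0.01, 0.6])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)
image_points = image_points.reshape(-1, 2)

# Simulate an occlusion: one corner is grossly mislocated in the image.
image_points[3] += np.array([40.0, -35.0])

# RANSAC-based pose recovery: rejects the corrupted correspondence, then
# estimates rotation and translation of the object with respect to the camera.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    object_points, image_points, K, dist,
    reprojectionError=3.0, iterationsCount=200)

if ok:
    R, _ = cv2.Rodrigues(rvec)  # 3x3 orthogonal rotation matrix
    print("Inlier indices:", inliers.ravel())
    print("Rotation:\n", R)
    print("Translation (m):", tvec.ravel())

In the paper's pipeline, this pose step would instead be carried out by the Extended Kalman Filter or the Orthogonal Iteration algorithm; the sketch only shows how RANSAC discards correspondences corrupted by occlusion before the pose is refined from the remaining inliers.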