Table 2: Mean frame rates achieved for the MCT and FPCT individually, and for our fusion filter updated from one or the other cue.

Tracker                     Frame rate [Hz]
MCT                         26.9
FPCT (1000 particles)       19.1
Fusion (update with MC)     23.8
Fusion (update with FPC)    17.9
5 CONCLUSION
We have presented a combination of video trackers within a particle filter framework. The filter fuses two cues, one provided by a marker-based tracker and the other by a feature point-based one. The motion model is adapted online according to the distance between past estimates.
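The online adaptation of the motion model can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the per-DoF noise parameterisation, and the `base_sigma`/`gain` parameters are assumptions, chosen only to show how a per-DoF process-noise scale might be derived from the distance between recent pose estimates.

```python
import numpy as np

def adapt_motion_noise(past_estimates, base_sigma=0.01, gain=1.0):
    """Hypothetical sketch: set a per-DoF process-noise scale from the
    mean absolute distance between consecutive past pose estimates.

    past_estimates : array-like of shape (T, n_dof), recent filter outputs.
    Returns an array of shape (n_dof,), one noise scale per DoF.
    """
    est = np.asarray(past_estimates, dtype=float)
    steps = np.abs(np.diff(est, axis=0))       # per-DoF motion between frames
    return base_sigma + gain * steps.mean(axis=0)
```

The returned scales would feed the prediction step of the particle filter: a DoF that has moved a lot recently gets a wider proposal, while a nearly static DoF keeps tight, low-noise dynamics.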
Experiments show that the proposed combination produces a synergy: the system tolerates occlusions and changes of illumination, and independent adaptive tuning of the model for each DoF yields superior performance during manoeuvres.
In future research, we will focus on extending the FPC to feature points beyond the four corners of the marker and on enhancing their viewpoint sensitivity.
ACKNOWLEDGEMENTS
The first author is supported by the Swiss National
Science Foundation (grant number 200021-113827),
and by the European Networks of Excellence K-
SPACE and VISNET2.
REFERENCES
Allen, B., Bishop, G., and Welch, G. (2001). Tracking:
Beyond 15 minutes of thought. In SIGGRAPH.
Arulampalam, M., Maskell, S., Gordon, N., and Clapp, T. (2002). A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Trans. on Signal Processing, 50(2):174–188.
Azuma, R. and Bishop, G. (1995). Improving static and
dynamic registration in an optical see-through HMD.
In SIGGRAPH, pages 197–204. ACM Press.
Chai, L., Nguyen, K., Hoff, B., and Vincent, T. (1999). An
adaptive estimator for registration in augmented real-
ity. In IWAR, pages 23–32.
Claus, D. and Fitzgibbon, A. (2004). Reliable fiducial de-
tection in natural scenes. In European Conference on
Computer Vision (ECCV), volume 3024, pages 469–
480. Springer-Verlag.
Davison, A. (2003). Real-time simultaneous localisation
and mapping with a single camera. In ICCV.
Fiala, M. (2005). ARTag, a fiducial marker system using
digital techniques. In CVPR, volume 2, pages 590–
596.
Ichimura, N. (2002). Stochastic filtering for motion trajectory in image sequences using a Monte Carlo filter with estimation of hyper-parameters. In ICPR, volume 4, pages 68–73.
Kanbara, M., Fujii, H., Takemura, H., and Yokoya, N.
(2000). A stereo vision-based augmented reality sys-
tem with an inertial sensor. In ISAR, pages 97–100.
Kato, H. and Billinghurst, M. (1999). Marker tracking and
HMD calibration for a video-based augmented reality
conferencing system. In IWAR, pages 85–94.
Koller, D., Klinker, G., Rose, E., Breen, D., Whitaker, R., and Tuceryan, M. (1997). Real-time vision-based camera tracking for augmented reality applications. In ACM Virtual Reality Software and Technology.
Maybeck, P. S. (1982). Stochastic Models, Estimation, and
Control, volume 141-2, chapter Parameter uncertain-
ties and adaptive estimation, pages 68–158. Academic
Press.
Najafi, H., Navab, N., and Klinker, G. (2004). Automated
initialization for marker-less tracking: a sensor fusion
approach. In ISMAR, pages 79–88.
Okuma, T., Kurata, T., and Sakaue, K. (2003). Fiducial-less 3-D object tracking in AR systems based on the integration of top-down and bottom-up approaches and automatic database addition. In ISMAR, page 260.
Pupilli, M. and Calway, A. (2005). Real-time camera track-
ing using a particle filter. In British Machine Vision
Conference, pages 519–528. BMVA Press.
Satoh, K., Uchiyama, S., Yamamoto, H., and Tamura,
H. (2003). Robot vision-based registration utilizing
bird’s-eye view with user’s view. In ISMAR, pages
46–55.
Vacchetti, L., Lepetit, V., and Fua, P. (2004). Combining
edge and texture information for real-time accurate 3d
camera tracking. In ISMAR, Arlington, VA.
Xu, X. and Li, B. (2006). Rao-Blackwellised particle filter with adaptive system noise and its evaluation for tracking in surveillance. In Visual Communications and Image Processing (VCIP). SPIE.
You, S. and Neumann, U. (2001). Fusion of vision and gyro
tracking for robust augmented reality registration. In
IEEE Virtual Reality (VR), pages 71–78.
You, S., Neumann, U., and Azuma, R. (1999). Hybrid iner-
tial and vision tracking for augmented reality registra-
tion. In IEEE Virtual Reality (VR), pages 260–267.
Zhang, X., Fronz, S., and Navab, N. (2002). Visual marker
detection and decoding in AR systems: A comparative
study. In ISMAR, pages 97–106.
VISAPP 2007 - International Conference on Computer Vision Theory and Applications