Authors:
Séverine Dubuisson¹; Myriam Robert-Seidowsky² and Jonathan Fabrizio²
Affiliations:
¹CNRS, UMR 7222 and ISIR, France; ²LRDE-EPITA, France
Keyword(s):
Visual Tracking, Particle Filter, Likelihood Function, Correction Step.
Related Ontology Subjects/Areas/Topics:
Computer Vision, Visualization and Computer Graphics; Motion, Tracking and Stereo Vision; Tracking and Visual Navigation
Abstract:
The particle filter is known to be efficient for visual tracking. However, its parameters are fixed empirically,
depending on the target application, the video sequences and the context. In this paper, we introduce a new
algorithm that automatically adjusts two major ones online: the correction and the propagation parameters.
Our goal is to determine, for each frame of a video, the optimal value of the correction parameter
and to adjust the propagation one accordingly to improve tracking performance. On the one hand, our experimental
results show that the common particle filter settings are sub-optimal; on the other hand, we show that our
approach achieves a lower tracking error without requiring these parameters to be tuned. Our adaptive method
can track objects in complex conditions (illumination changes, cluttered background, etc.) without adding any
computational cost compared to the common usage with fixed parameters.
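To make the two parameters concrete, the sketch below shows a generic one-dimensional bootstrap particle filter step — not the authors' adaptive algorithm. The names `sigma_prop` (propagation: random-walk noise scale) and `lam_corr` (correction: sharpness of the exponential likelihood) are illustrative assumptions; in the paper these are the quantities adjusted online per frame rather than fixed by hand.

```python
import numpy as np

def particle_filter_step(particles, weights, observation, sigma_prop, lam_corr, rng):
    """One predict-correct-resample cycle of a bootstrap particle filter.

    sigma_prop: propagation parameter (scale of the random-walk diffusion).
    lam_corr:   correction parameter (sharpness of the likelihood function).
    """
    # Propagation: diffuse particles with Gaussian random-walk noise.
    particles = particles + rng.normal(0.0, sigma_prop, size=particles.shape)
    # Correction: reweight each particle by an exponential likelihood
    # of its squared distance to the observation.
    dist2 = (particles - observation) ** 2
    weights = weights * np.exp(-lam_corr * dist2)
    weights /= weights.sum()
    # Systematic resampling to avoid weight degeneracy.
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, 200)
weights = np.full(200, 1.0 / 200)
for obs in [0.5, 1.0, 1.5, 2.0]:  # target drifting to the right
    particles, weights = particle_filter_step(
        particles, weights, obs, sigma_prop=0.5, lam_corr=10.0, rng=rng)
estimate = float(np.average(particles, weights=weights))
print(estimate)
```

A sharper likelihood (larger `lam_corr`) concentrates weight on particles near the observation, while a larger `sigma_prop` spreads the prediction cloud; the trade-off between the two is exactly what makes fixed, hand-tuned values sub-optimal across frames.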