Table 2: Stability (variance of tracking errors over 10 runs) obtained with fixed and adaptive α on several video sequences. The lowest value in each row is shown in bold in the original (in every row, the adaptive α).

Sequence   | Our adapt. α | α = 1/(√2 d_min) | α = 20 | α = 50 | α = 100 | α = 200
CarDark    | 0.4          | 12.8             | 23.5   | 20.0   | 22.7    | 18.9
Fish       | 0.2          | 0.6              | 0.7    | 0.3    | 0.4     | 0.4
Matrix     | 7.9          | 14.6             | 15.3   | 12.7   | 18.8    | 9.5
Couple     | 1.2          | 3.1              | 2.5    | 1.5    | 2.4     | 2.0
CarScale   | 4.3          | 7.1              | 6.3    | 5.6    | 5.1     | 7.2
Football   | 2.2          | 29.8             | 93.5   | 41.1   | 34.8    | 43.9
Dog1       | 2.3          | 5.1              | 3.9    | 4.2    | 3.9     | 10.2
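The stability score reported in Table 2 is simply the variance of the tracking error across the 10 independent runs of each sequence. A minimal sketch (the per-run error values below are hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical mean tracking errors (in pixels) for one sequence,
# one value per run over 10 independent runs of the particle filter.
errors_per_run = np.array([10.2, 11.5, 9.8, 10.9, 10.4,
                           11.1, 9.9, 10.7, 10.3, 11.0])

# Stability as in Table 2: variance of the tracking error over the runs
# (lower variance = more stable tracker).
stability = errors_per_run.var()
```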
In the particle filter, the most time-consuming step is the correction step, more precisely the computation of the distances between the particles and the model. This step is performed only once, whether α is fixed or adaptive. Our algorithm is then just a loop over the candidate values of α (from 10 to 500, with a step of 10), i.e. a maximum of 50 values. This is why computing our optimal α adds no cost to the overall particle filter algorithm. Note that our computation times are sometimes even lower: because our likelihood density is better adapted to the particle set, the resampling can require less time. On average, over all 50 tested sequences, our computation times increase by only +0.42%.
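The loop over candidate α values can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: it assumes an exponential likelihood of the form exp(-α·d), and since the paper's selection criterion is not given in this excerpt, a placeholder criterion (entropy of the normalized particle weights) stands in for it.

```python
import numpy as np

def select_alpha(distances, alphas=np.arange(10, 501, 10), criterion=None):
    """Pick the alpha that maximizes a quality criterion.

    distances: particle-to-model distances, computed ONCE beforehand;
               the loop below reuses them, so no extra distance cost.
    alphas:    candidate values, 10 to 500 in steps of 10 (50 values).
    criterion: scoring function over particle weights. The paper's actual
               criterion is not given in this excerpt; the default below
               (weight entropy) is an assumption for illustration only.
    """
    if criterion is None:
        def criterion(w):
            p = w / w.sum()                      # normalize weights
            return -(p * np.log(p + 1e-12)).sum()  # entropy (placeholder)
    best_alpha, best_score = None, -np.inf
    for alpha in alphas:
        weights = np.exp(-alpha * distances)     # cheap: distances reused
        score = criterion(weights)
        if score > best_score:
            best_alpha, best_score = alpha, score
    return best_alpha
```

Because only the cheap exponentiation and scoring are repeated per candidate, the 50-value sweep is negligible next to the distance computation itself.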
5 CONCLUSION
In this paper, we have presented an approach that automatically adapts, over time, fundamental parameters of the likelihood function of the particle filter. More precisely, using a single and simple criterion, it sets both the correction parameter and the propagation parameter. Moreover, it incurs no additional cost in terms of computation time.
Our tests on several challenging video sequences have demonstrated the strong impact of these two parameters on tracking performance. They are nevertheless often neglected and set to fixed, arbitrary values that can, as we have shown, increase the tracking error. Our experiments also show that our method, which adapts these parameters to the context, greatly improves the robustness of the particle filter: it converges better and faster, and is more stable. In particular, our algorithm still succeeds in complex tracking situations such as illumination changes or proximity between objects of similar color.
Future work will concern validating our approach with other kinds of descriptors (such as wavelets) and similarity measures (such as the Chamfer distance), in order to demonstrate the generality of our technique. We also intend to show that our approach can improve multi-cue and multi-modal tracking accuracy. Finally, we are working on a mathematical proof of the validity of the criterion used to determine the optimal correction value.
A Self-adaptive Likelihood Function for Tracking with Particle Filter