A NEW SET OF FEATURES FOR ROBUST CHANGE
DETECTION
José Sigut, Sid-Ahmed Ould Sidha, Juan Díaz and Carina González
Department of Systems Engineering and Computer Architecture, University of La Laguna, Tenerife, Spain
Keywords: Change detection, motion detection, image differencing, robustness, illumination changes.
Abstract: A new set of features for robust change detection is proposed. These features are obtained from a
transformation of the thresholded intensity difference image. Their performance is tested on two video
sequences acquired in a human-machine interaction scenario under very different illumination conditions.
Several performance measures are computed and a comparison with other well-known classical change
detection methods is made. The experiments performed show the effectiveness and robustness of our
proposal.
1 INTRODUCTION
Detecting regions of change in images of the same
scene taken at different times is of widespread
interest due to a large number of applications in
diverse disciplines. Common applications of image
differencing include object tracking, intruder
surveillance systems, vehicle surveillance systems
and interframe data compression (Radke et al., 2005).
Due to its simplicity, image differencing has become
a very popular method for change detection. It only
requires calculating the absolute values of the
difference between the corresponding pixels in the
two frames considered. In the context of surveillance
applications, each frame is usually compared against
a reference or background model (Cheung and
Kamath, 2004; Migliore et al., 2006). Large values
in the difference map indicate regions of change.
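The basic differencing operation described above can be sketched as follows (a minimal illustration with toy arrays; the function name `change_mask` and the threshold value are ours, not the paper's):

```python
import numpy as np

def change_mask(frame, background, threshold):
    """Absolute intensity difference against a background model,
    thresholded to yield a binary change mask."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Toy 4x4 grayscale frames: a 2x2 block of pixels changes.
background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 120              # simulated moving object
mask = change_mask(frame, background, threshold=30)
print(mask.sum())                  # -> 4 changed pixels
```

In surveillance settings the `background` array would be a maintained reference model rather than a single earlier frame.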
The crucial point here is the determination of the
optimal decision thresholds allowing for minimal
error probabilities and thus guaranteeing results
which are robust against noise changes over time,
e.g. due to changes in illumination conditions. This
indicates that in general threshold values should be
calculated dynamically based on the image content
and that empirically selecting a value is not
appropriate for most applications. Rosin (Rosin,
2002; Rosin and Ioannidis, 2003) surveyed and
reported experiments on many different criteria for
choosing the decision threshold.
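One simple way to compute a threshold dynamically from the image content, in the spirit of the data-driven criteria discussed above (this is our illustrative rule, not one of the specific criteria surveyed by Rosin), is to estimate the noise scale of the difference image robustly and take a multiple of it:

```python
import numpy as np

def dynamic_threshold(diff, k=3.0):
    """Data-driven threshold for a signed difference image.

    The noise scale is estimated with the median absolute deviation
    (MAD), which is robust because genuinely changed pixels behave as
    outliers and barely affect the median. The factor 1.4826 rescales
    the MAD to a Gaussian standard deviation; the threshold is then
    k standard deviations.
    """
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return k * sigma

# Pure-noise signed differences with standard deviation 2.0:
rng = np.random.default_rng(0)
diff = rng.normal(0.0, 2.0, size=10_000)
t = dynamic_threshold(diff)        # approximately 3 * 2.0 = 6.0
```

Because the threshold is recomputed for every frame pair, it adapts as the noise level drifts over time instead of relying on a single empirically chosen value.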
The decision rule in many change detection
algorithms is cast as a statistical hypothesis test.
The decision as to whether or not a change has
occurred at a given pixel corresponds to choosing
one of two hypotheses: the null hypothesis H0 or
the alternative hypothesis H1, corresponding to the
no-change and change decisions respectively.
Characterizing the null hypothesis is usually much
easier, since in the absence of any change, the
difference between image intensities can be assumed
to be due to noise alone. A significance test on the
difference image can be performed to assess how
well the null hypothesis describes the observations,
and this hypothesis is correspondingly accepted or
rejected. Modelling the background noise in static
applications is straightforward since any required
estimation can be done off-line for the used camera
system. However, a real time sequence is much
more challenging since noise features may change
over time and noise estimation must be done on-line
from unchanged regions which are not known a
priori (Thoma and Bierling, 1989). Aach et al. (Aach
et al., 1993; Aach et al., 2001) characterized the
noise in moving video as zero-mean Gaussian
random variables. The variances for the noise were
estimated from regions with very small intensity
differences. Bruzzone and Prieto (Bruzzone and
Prieto, 2000) noted that while the variances
estimated this way may serve as good initial guesses,
using them in a decision rule may result in a false
alarm rate different from the desired value.
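A per-pixel significance test of this kind can be sketched as below, assuming the zero-mean Gaussian noise model discussed above; the on-line variance estimate from low-difference regions follows the idea attributed to Aach et al., with the coarse pre-threshold and all numeric values being our illustrative choices:

```python
import math
import numpy as np

def significance_test(diff, sigma, alpha=0.01):
    """Test the no-change hypothesis H0 at each pixel.

    Under H0 the intensity difference is zero-mean Gaussian noise with
    standard deviation sigma. H0 is rejected (change declared) where
    the two-sided p-value falls below the significance level alpha.
    """
    z = np.abs(diff) / sigma
    p = np.array([math.erfc(v / math.sqrt(2.0)) for v in z])
    return p < alpha

# Estimate sigma on-line from pixels with very small differences,
# taken as unchanged (coarse pre-threshold of 3.0 is illustrative):
diff = np.array([0.5, -1.0, 0.2, 9.0, -0.8, 8.5])
quiet = diff[np.abs(diff) < 3.0]
# Zero-mean model, so average squared differences directly:
sigma = float(np.sqrt(np.mean(quiet ** 2)))
changed = significance_test(diff, sigma)   # flags the two large values
```

As noted above, such a variance estimate is only a good initial guess: if some "quiet" pixels are in fact changed, sigma is biased and the realized false alarm rate deviates from alpha.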
In this paper, background noise is modelled by
using a new set of features as an alternative to the
usual intensity differences. We will show the
robustness of this approach to changes in the
illumination conditions. Section 2 of this paper
Sigut J., Ould Sidha S., Díaz J. and González C. (2008). A NEW SET OF FEATURES FOR ROBUST CHANGE DETECTION. In Proceedings of the Third International Conference on Computer Vision Theory and Applications, pages 592-596. DOI: 10.5220/0001080205920596. Copyright © SciTePress.