MODELLING A BACKGROUND FOR BACKGROUND
SUBTRACTION FROM A SEQUENCE OF IMAGES
Formulation of Probability Distribution of Pixel Positions
Suil Son, Young-Woon Cha and Suk I. Yoo
School of Computer Science and Engineering, Seoul National University, 599 Gwanak-ro, Seoul, Korea
Keywords: Background subtraction, Foreground detection, Probability distribution of pixel positions, Intensity clusters,
Kernel density estimation.
Abstract: This paper presents a new background subtraction approach to identifying the various changes of objects in
a sequence of images. A background is modelled as the probability distribution of pixel positions given
intensity clusters, constructed from a given sequence of images. Each pixel in a new image is then classified
as background or foreground, depending on its probability under the distribution of pixel positions
representing the background. The presented approach is illustrated with two examples. Compared to
traditional intensity-based approaches, it is shown to be robust to dynamic textures and various changes of
illumination.
1 INTRODUCTION
Detecting a meaningful foreground from a sequence
of images, known as background subtraction, has
been studied intensively due to its wide range of
applications, such as tracking, identification and
surveillance. Two issues in developing background
subtraction methods are how to handle changes of
illumination due to noise or lighting and how to
manage dynamic textures such as swaying trees or
flowing water.
To manage changes of illumination, most
background subtraction methods have used intensity
distributions (Wren, Darrell and Pentland, 1997;
Stauffer and Grimson, 1999; Elgammal, Harwood
and Davis, 2000; Power and Schoonees, 2002;
Zivkovic and Heijden, 2006; Dalley, Migdal and
Grimson, 2008). Using intensity distributions,
however, does not work well when the illumination
changes greatly across all pixels of the image. To
resolve dynamic textures, mixtures of Gaussians
(Stauffer and Grimson, 1999; Power and Schoonees,
2002; Zivkovic and Heijden, 2006; Dalley et al.,
2008) and kernel density estimation (Elgammal et
al., 2000; Mittal and Paragios, 2004) have been
suggested. To correctly recognize a background
having small motion, spatial information of objects
is necessary. A window formed from the neighbours
of a pixel may be used to reflect such spatial
information (Elgammal et al., 2000; Dalley et al.,
2008). Although the approach using windows
reduces false detection of foreground for dynamic
textures, it is not easy to define the exact size of a
window in advance. Sheikh and Shah (2005)
suggested a joint distribution of positions and
intensities to reflect the spatial information. Since
this joint representation of image pixels reflects the
local spatial structure, it works well on motion of
background objects. However, it suffers from the
curse of dimensionality arising from its high-
dimensional data representation.
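As a concrete illustration of the per-pixel intensity modelling discussed
above, the following Python sketch evaluates a Gaussian kernel density
estimate over each pixel's recent intensity history, in the spirit of
Elgammal et al. (2000); the function name, bandwidth, and threshold are
illustrative assumptions, not the published implementation.

    import numpy as np

    def kde_foreground_mask(history, frame, bandwidth=15.0, threshold=1e-3):
        # history: (N, H, W) array of the last N grey-level frames.
        # frame:   (H, W) array, the new grey-level frame to classify.
        # A pixel is foreground when its intensity is unlikely under a
        # Gaussian KDE built from that pixel's own intensity history.
        diff = frame[np.newaxis, :, :] - history            # (N, H, W)
        kernels = np.exp(-0.5 * (diff / bandwidth) ** 2)
        density = kernels.mean(axis=0) / (bandwidth * np.sqrt(2.0 * np.pi))
        return density < threshold                          # True = foreground

    # Usage on synthetic data: a static background plus one bright object.
    rng = np.random.default_rng(0)
    history = rng.normal(120.0, 5.0, size=(50, 240, 320))
    frame = history[-1].copy()
    frame[100:120, 150:180] = 250.0
    mask = kde_foreground_mask(history, frame)
    print(mask.sum(), "pixels flagged as foreground")

Because each pixel is modelled independently in such a scheme, a large
global change of illumination or a dynamic texture perturbs many pixel
histories at once, which is precisely where per-pixel intensity models
break down.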
This paper presents a new approach to relaxing
these difficulties of the traditional background
subtraction methods: a background is modelled as
the probability distribution of pixel positions given
intensity clusters. An image in a given sequence is
assumed to have M intensity sources, from which M
intensity clusters can be formulated. Although it is
not easy to determine the optimal value of M for a
given image, especially when the image is complex
with various objects, M is assumed in general to be
no larger than six, given the grey-level range of 0 to
255.
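As a minimal sketch of how M intensity clusters might be formed, the
following applies a one-dimensional k-means over grey levels; the choice
of k-means, the value M = 4, and all names here are illustrative
assumptions, since the paper constrains M only to be at most six.

    import numpy as np

    def intensity_clusters(image, M=4, iters=20, seed=0):
        # Cluster grey levels into M clusters with 1-D k-means and return,
        # for each pixel, the index of its intensity cluster.
        rng = np.random.default_rng(seed)
        pixels = image.reshape(-1).astype(float)
        centres = rng.choice(pixels, size=M, replace=False)
        for _ in range(iters):
            labels = np.argmin(np.abs(pixels[:, None] - centres[None, :]),
                               axis=1)
            for m in range(M):
                if np.any(labels == m):             # guard empty clusters
                    centres[m] = pixels[labels == m].mean()
        return labels.reshape(image.shape), centres

    # Usage: the per-pixel cluster labels feed the position distributions
    # described next.
    image = np.clip(np.random.default_rng(1).normal(128, 40, (240, 320)),
                    0, 255).astype(np.uint8)
    labels, centres = intensity_clusters(image, M=4)
    print("cluster centres:", np.sort(centres))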
For each of the M intensity clusters, the distribution
of pixel positions is then computed from the
sequence of images. The computed distribution of