Exploiting Scene Cues for Dropped Object Detection

Adolfo Lopez-Mendez, Florent Monay, Jean-Marc Odobez


This paper presents a method for the automated detection of dropped objects in surveillance scenarios, an important task for abandoned object detection. Our method works in single views and exploits prior information about the scene, such as its geometry or the fact that many false alarms are caused by known objects such as humans. The proposed approach builds dropped object candidates by analyzing blobs obtained with a multi-layer background subtraction algorithm. Each candidate is then characterized both by its appearance and by temporal aspects such as the estimated drop time. Next, we incorporate prior knowledge about the plausible sizes and positions of dropped objects through an efficient filtering approach. Finally, the output of a human detector is exploited to filter out static objects that are likely to be humans standing still. Experimental results on the publicly available PETS2006 dataset and on several long sequences recorded in metro stations show the effectiveness of the proposed approach. Furthermore, our approach can operate in real time.
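The candidate-filtering stages summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `Candidate` representation, the `scene_scale` geometry prior, and all thresholds are hypothetical assumptions chosen for the example, and the multi-layer background subtraction and human detection steps are assumed to have already produced the candidate's bounding box, drop time, and detector score.

```python
# Hypothetical sketch of the dropped-object filtering pipeline described in
# the abstract. Names and thresholds are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Candidate:
    """A static-blob candidate produced upstream by background subtraction."""
    x: int            # bounding box position (pixels)
    y: int
    w: int            # bounding box size (pixels)
    h: int
    drop_time: float  # estimated time the object appeared (seconds)
    human_score: float  # confidence from a human detector run on the blob


def plausible_size(c, scene_scale, min_rel=0.2, max_rel=1.5):
    """Scene-geometry prior: the blob height must be a plausible fraction
    of the expected object height at that image row (perspective effect)."""
    expected = scene_scale(c.y)  # expected object height at row y
    return min_rel * expected <= c.h <= max_rel * expected


def is_dropped_object(c, now, scene_scale, min_static=10.0, human_thresh=0.5):
    """Accept a candidate only if it passes all three cues from the paper."""
    if not plausible_size(c, scene_scale):
        return False  # violates the size/position prior
    if now - c.drop_time < min_static:
        return False  # has not remained static long enough
    if c.human_score >= human_thresh:
        return False  # likely a person standing still, not a dropped object
    return True
```

As a usage example, with a linear perspective model `scene_scale = lambda y: 0.3 * y`, a small static blob with a low human-detector score is accepted once it has been static long enough, while a blob the detector scores as human is rejected.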



Paper Citation

in Harvard Style

Lopez-Mendez A., Monay F. and Odobez J. (2014). Exploiting Scene Cues for Dropped Object Detection. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP (VISIGRAPP 2014), ISBN 978-989-758-004-8, pages 14-21. DOI: 10.5220/0004654800140021

in Bibtex Style

@conference{visapp14,
author={Adolfo Lopez-Mendez and Florent Monay and Jean-Marc Odobez},
title={Exploiting Scene Cues for Dropped Object Detection},
booktitle={Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)},
year={2014},
pages={14-21},
doi={10.5220/0004654800140021},
isbn={978-989-758-004-8},
}

in EndNote Style

JO - Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)
TI - Exploiting Scene Cues for Dropped Object Detection
SN - 978-989-758-004-8
AU - Lopez-Mendez A.
AU - Monay F.
AU - Odobez J.
PY - 2014
SP - 14
EP - 21
DO - 10.5220/0004654800140021