REAL TIME OBJECT TRACKING ON GPGPU

Maciej Chociej, Adam Polak

2012

Abstract

We propose a system for tracking objects in a video stream from a stationary camera. As is common, our method involves foreground-background separation and optical flow computation. The main contribution is a fast feedback process that leads to accurate detection of background-object and object-object boundaries and maintains them during object occlusions. The contribution of this paper also includes improvements to dense optical flow computation and foreground separation. The described methods were implemented on a GPGPU and achieve performance sufficient for real-time processing. Additionally, our approach makes no a priori assumptions about the characteristics of the tracked objects and can be used to track both rigid and deformable objects of various shapes and sizes.
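
As a rough illustration of the foreground-background separation stage mentioned in the abstract (not the authors' implementation, whose feedback process and occlusion handling are more elaborate), the following CUDA sketch classifies each pixel against a running-average background model; the kernel name, threshold, and learning rate are illustrative assumptions.

// Minimal sketch, assuming a grayscale frame already resident in GPU memory.
// Each thread handles one pixel: compare it against a running-average
// background model, write a foreground mask, and slowly adapt the background
// where no object is present. Names and parameter values are hypothetical.
#include <cuda_runtime.h>

__global__ void segment_and_update(const unsigned char* frame,  // current frame
                                    float* background,           // background model
                                    unsigned char* fg_mask,      // output mask (0/255)
                                    int n_pixels,
                                    float threshold,             // e.g. 25.0f (assumed)
                                    float alpha)                 // e.g. 0.02f learning rate (assumed)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n_pixels) return;

    float diff = fabsf((float)frame[i] - background[i]);
    bool is_foreground = diff > threshold;
    fg_mask[i] = is_foreground ? 255 : 0;

    // Update the background model only where the pixel looks like background,
    // so that moving objects do not bleed into it.
    if (!is_foreground)
        background[i] = (1.0f - alpha) * background[i] + alpha * (float)frame[i];
}

// Host-side launch for a W x H frame:
//   int n = W * H;
//   segment_and_update<<<(n + 255) / 256, 256>>>(d_frame, d_bg, d_mask, n, 25.0f, 0.02f);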



Paper Citation


in Harvard Style

Chociej M. and Polak A. (2012). REAL TIME OBJECT TRACKING ON GPGPU. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2012) ISBN 978-989-8565-04-4, pages 303-310. DOI: 10.5220/0003862303030310


in Bibtex Style

@conference{visapp12,
author={Maciej Chociej and Adam Polak},
title={REAL TIME OBJECT TRACKING ON GPGPU},
booktitle={Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2012)},
year={2012},
pages={303-310},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003862303030310},
isbn={978-989-8565-04-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2012)
TI - REAL TIME OBJECT TRACKING ON GPGPU
SN - 978-989-8565-04-4
AU - Chociej M.
AU - Polak A.
PY - 2012
SP - 303
EP - 310
DO - 10.5220/0003862303030310