R. Reulke, S. Bauer, T. Döring, R. Spangenberg


Non-intrusive video detection for traffic-flow observation and surveillance is the primary alternative to conventional inductive loop detectors. Video Image Detection Systems (VIDS) derive traffic parameters by means of image processing and pattern recognition methods. Existing VIDS merely emulate inductive loops. We propose a trajectory-based recognition algorithm that extends the common approach and yields new types of information (e.g. queue length or erratic movements). Because single-camera systems suffer from occlusions by other cars, trees and traffic signs, the same area must be observed from different views by more than one camera sensor. A distributed, cooperative multi-camera system also enables a significant enlargement of the observation area. Trajectories are derived by multi-target tracking, and object data from the different cameras are fused by a tracking approach. This opens up opportunities to identify and characterise traffic objects by their location, speed and other characteristic attributes. The system thus produces new, consolidated information about traffic participants, including descriptions of individual participants.
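The fusion step described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes each camera's detections have already been projected to a common ground-plane coordinate frame, and it pairs detections from two cameras by greedy nearest-neighbour gating, averaging paired positions. The function name, the gating threshold, and the sample coordinates are all illustrative assumptions.

```python
from math import hypot

GATE = 2.0  # metres; association threshold on the ground plane (assumed value)

def fuse_detections(cam_a, cam_b, gate=GATE):
    """Pair detections from two cameras and average paired positions.

    cam_a, cam_b: lists of (x, y) ground-plane positions in a common
    world frame (cameras assumed already calibrated and georeferenced).
    Unmatched detections pass through unchanged, so an object occluded
    in one view is still reported from the other.
    """
    fused, used_b = [], set()
    for ax, ay in cam_a:
        # greedy nearest-neighbour match within the gate
        best, best_d = None, gate
        for j, (bx, by) in enumerate(cam_b):
            d = hypot(ax - bx, ay - by)
            if j not in used_b and d < best_d:
                best, best_d = j, d
        if best is None:
            fused.append((ax, ay))  # seen only by camera A
        else:
            used_b.add(best)
            bx, by = cam_b[best]
            fused.append(((ax + bx) / 2, (ay + by) / 2))
    # detections seen only by camera B
    fused.extend(p for j, p in enumerate(cam_b) if j not in used_b)
    return fused

# One time step: camera B also sees a vehicle that camera A occludes.
cam_a = [(10.0, 5.0)]
cam_b = [(10.4, 5.2), (30.0, 8.0)]
print(fuse_detections(cam_a, cam_b))
```

A full system would run this per frame and feed the fused positions into a multi-target tracker (e.g. a Kalman filter per object) to build the trajectories the paper describes; greedy gating is only the simplest association scheme.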



Paper Citation

in Harvard Style

Reulke R., Bauer S., Döring T. and Spangenberg R. (2008). MULTI-CAMERA DETECTION AND MULTI-TARGET TRACKING - Traffic Surveillance Applications. In Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP (VISIGRAPP 2008), ISBN 978-989-8111-21-0, pages 585-591. DOI: 10.5220/0001085705850591

in Bibtex Style

@conference{visapp08,
author={R. Reulke and S. Bauer and T. Döring and R. Spangenberg},
title={MULTI-CAMERA DETECTION AND MULTI-TARGET TRACKING - Traffic Surveillance Applications},
booktitle={Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)},
year={2008},
pages={585-591},
doi={10.5220/0001085705850591},
isbn={978-989-8111-21-0},
}

in EndNote Style

TY - CONF
TI - MULTI-CAMERA DETECTION AND MULTI-TARGET TRACKING - Traffic Surveillance Applications
JO - Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)
SN - 978-989-8111-21-0
AU - Reulke R.
AU - Bauer S.
AU - Döring T.
AU - Spangenberg R.
PY - 2008
SP - 585
EP - 591
DO - 10.5220/0001085705850591