Control of a PTZ Camera in a Hybrid Vision System

Francois Rameau, Cédric Demonceaux, Désiré Sidibé, David Fofi

2014

Abstract

In this paper, we propose a new approach to steer a PTZ camera toward an object detected by another, fixed camera equipped with a fisheye lens. This heterogeneous association of two cameras with different characteristics is called a hybrid stereo-vision system. The presented method exploits epipolar geometry to narrow the search range for the desired region of interest. Furthermore, we propose a target recognition method designed to cope with illumination changes, the distortion of the omnidirectional image, and the inherent differences in resolution and color response between the two cameras. Experimental results on synthetic and real images show the robustness of the proposed method.
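
As a rough illustration of the kind of geometry involved, the sketch below (Python/NumPy) lifts a point detected in the fisheye image onto the unit viewing sphere using the unified spherical model and converts the resulting ray into pan/tilt angles for the PTZ camera. This is not the authors' implementation: the unified-model parametrisation, the identity rotation between camera frames, the negligible-baseline assumption and all intrinsic values are illustrative assumptions; with a real baseline the target depth is unknown, which is why the paper restricts the search to the epipolar geometry instead of a single transferred ray.

import numpy as np

# Minimal sketch (hypothetical values, not the authors' code): map a fisheye
# detection to pan/tilt commands, assuming the fisheye camera follows the
# unified spherical model and both cameras are roughly co-located.

def lift_to_sphere(u, v, fx, fy, cx, cy, xi):
    """Back-project a fisheye pixel onto the unit sphere (unified model)."""
    x = (u - cx) / fx
    y = (v - cy) / fy
    r2 = x * x + y * y
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (r2 + 1.0)
    return np.array([eta * x, eta * y, eta - xi])  # unit-norm viewing ray

def ray_to_pan_tilt(ray, R_fisheye_to_ptz=np.eye(3)):
    """Convert a viewing ray into pan/tilt angles (radians).

    R_fisheye_to_ptz is the rotation between the two camera frames; identity
    is assumed here for illustration only.
    """
    d = R_fisheye_to_ptz @ ray
    pan = np.arctan2(d[0], d[2])                    # rotation about the vertical axis
    tilt = np.arctan2(-d[1], np.hypot(d[0], d[2]))  # elevation above the horizontal plane
    return pan, tilt

if __name__ == "__main__":
    # Hypothetical calibration values for the fisheye camera.
    ray = lift_to_sphere(u=640, v=400, fx=350.0, fy=350.0, cx=512.0, cy=384.0, xi=0.8)
    pan, tilt = ray_to_pan_tilt(ray)
    print(f"pan = {np.degrees(pan):.1f} deg, tilt = {np.degrees(tilt):.1f} deg")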



Paper Citation


in Harvard Style

Rameau F., Demonceaux C., Sidibé D. and Fofi D. (2014). Control of a PTZ Camera in a Hybrid Vision System. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP (VISIGRAPP 2014), ISBN 978-989-758-009-3, pages 397-405. DOI: 10.5220/0004734703970405


in Bibtex Style

@conference{visapp14,
author={Francois Rameau and Cédric Demonceaux and Désiré Sidibé and David Fofi},
title={Control of a PTZ Camera in a Hybrid Vision System},
booktitle={Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2014)},
year={2014},
pages={397-405},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004734703970405},
isbn={978-989-758-009-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2014)
TI - Control of a PTZ Camera in a Hybrid Vision System
SN - 978-989-758-009-3
AU - Rameau F.
AU - Demonceaux C.
AU - Sidibé D.
AU - Fofi D.
PY - 2014
SP - 397
EP - 405
DO - 10.5220/0004734703970405