Optimizing Camera Placement in Motion Tracking Systems

Dávid Szalóki, Kristóf Csorba, Gábor Tevesz

Abstract

This paper discusses the placement of cameras so as to achieve the highest possible localization accuracy, which is attained by using several cameras with redundant fields of view. A camera model is introduced and the components that cause localization errors are identified. A localization accuracy measure is defined both for a single camera and for multiple cameras. The problem of adding a new camera to the system in order to improve accuracy is formulated, and a method for finding the optimal placement of this new camera is presented. Finally, features are enumerated that can be applied to obtain a more advanced method.
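The workflow the abstract describes — score a candidate placement with a multi-camera accuracy measure, then search for the placement that maximizes it — can be sketched as below. The concrete error model (`single_camera_variance`, inverse-variance fusion) and the exhaustive candidate search are illustrative assumptions for the sketch, not the paper's actual formulation:

```python
import math

def single_camera_variance(cam, point, base_sigma=0.01):
    # Hypothetical per-camera error model: localization variance grows
    # with the camera-to-target distance. Placeholder for the paper's model.
    d = math.dist(cam, point)
    return (base_sigma * d) ** 2

def fused_accuracy(cameras, point):
    # Inverse-variance fusion of redundant views: more cameras, or closer
    # ones, yield a larger (better) accuracy value at this point.
    return sum(1.0 / single_camera_variance(c, point) for c in cameras)

def worst_case_accuracy(cameras, targets):
    # Accuracy of a camera set over a target region = its weakest point.
    return min(fused_accuracy(cameras, p) for p in targets)

def best_new_placement(existing, candidates, targets):
    # Exhaustive search over candidate positions for the new camera,
    # maximizing the worst-case accuracy over the target region.
    return max(candidates,
               key=lambda c: worst_case_accuracy(existing + [c], targets))
```

With one existing camera at the origin and a single target at (5, 5), a candidate near the target wins over one near the existing camera, since it contributes far more inverse variance at the target.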



Paper Citation


in Harvard Style

Szalóki D., Csorba K. and Tevesz G. (2014). Optimizing Camera Placement in Motion Tracking Systems. In Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 978-989-758-040-6, pages 288-295. DOI: 10.5220/0005012202880295


in Bibtex Style

@conference{icinco14,
author={Dávid Szalóki and Kristóf Csorba and Gábor Tevesz},
title={Optimizing Camera Placement in Motion Tracking Systems},
booktitle={Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2014},
pages={288-295},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005012202880295},
isbn={978-989-758-040-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - Optimizing Camera Placement in Motion Tracking Systems
SN - 978-989-758-040-6
AU - Szalóki D.
AU - Csorba K.
AU - Tevesz G.
PY - 2014
SP - 288
EP - 295
DO - 10.5220/0005012202880295