RGB-D Tracking and Reconstruction for TV Broadcasts

Tommi Tykkälä, Hannu Hartikainen, Andrew I. Comport, Joni-Kristian Kämäräinen

Abstract

In this work, a real-time image-based camera tracking solution is developed for television broadcasting studio environments. An affordable vision-based system is proposed which can compete with expensive matchmoving systems. The system requires only commodity hardware: a low-cost RGB-D sensor and a standard laptop. The main contribution is avoiding drift that accumulates over time by tracking relative to a pre-recorded keyframe model. Camera tracking is defined as a registration problem between the current RGB-D measurement and the nearest keyframe. Because the keyframe poses contain only a small error, the proposed method is virtually driftless. Camera tracking precision is compared to KinectFusion, a recent method for simultaneous camera tracking and 3D reconstruction. The proposed method is tested in a television broadcasting studio, where it demonstrates driftless and precise camera tracking in real-time.
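The abstract's core idea is to register each incoming RGB-D frame against the nearest pre-recorded keyframe rather than against the previous frame, so pose error does not accumulate. As a minimal illustrative sketch (not the authors' implementation), the keyframe-selection step could look like the following, assuming camera poses are stored as 4x4 homogeneous matrices and "nearest" is taken as Euclidean distance between camera positions; the function name and this distance metric are illustrative assumptions:

```python
import numpy as np

def nearest_keyframe(current_pose, keyframe_poses):
    """Return the index of the keyframe whose camera position is
    closest to the current camera position.

    current_pose   -- 4x4 homogeneous pose matrix of the current frame
    keyframe_poses -- list of 4x4 pose matrices of recorded keyframes
    """
    cur_pos = current_pose[:3, 3]  # translation component
    dists = [np.linalg.norm(T[:3, 3] - cur_pos) for T in keyframe_poses]
    return int(np.argmin(dists))
```

The selected keyframe would then serve as the fixed reference for the RGB-D registration; since keyframe poses were estimated offline with only small error, the per-frame tracking error stays bounded instead of growing over time.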

References

  1. Audras, C., Comport, A. I., Meilland, M., and Rives, P. (2011). Real-time dense RGB-D localisation and mapping. In Australian Conference on Robotics and Automation, Monash University, Australia.
  2. Baker, S. and Matthews, I. (2004). Lucas-Kanade 20 years on: A unifying framework. Int. J. Comput. Vision, 56(3):221-255.
  3. Bouguet, J.-Y. (2010). Camera calibration toolbox for Matlab. http://www.vision.caltech.edu/bouguetj/calib_doc
  4. Comport, A., Malis, E., and Rives, P. (2007). Accurate quadri-focal tracking for robust 3D visual odometry. In IEEE Int. Conf. on Robotics and Automation, ICRA'07, Rome, Italy.
  5. Davison, A., Reid, I., Molton, N., and Stasse, O. (2007). MonoSLAM: Real-time single camera SLAM. PAMI, 29:1052-1067.
  6. Dobbert, T. (2005). Matchmoving: The Invisible Art of Camera Tracking. Sybex.
  7. Henry, P., Krainin, M., Herbst, E., Ren, X., and Fox, D. (2012). RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments. The International Journal of Robotics Research, 31(5):647-663.
  8. Herrera, C., Kannala, J., and Heikkilä, J. (2012). Joint depth and color camera calibration with distortion correction. IEEE PAMI, 34(10).
  9. Kato, H. and Billinghurst, M. (1999). Marker tracking and hmd calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd International Workshop on Augmented Reality (IWAR 99), San Francisco, USA.
  10. Klein, G. and Murray, D. (2007). Parallel tracking and mapping for small AR workspaces. In Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), pages 225-234.
  11. Ma, Y., Soatto, S., Kosecka, J., and Sastry, S. (2004). An invitation to 3-D vision: from images to geometric models, volume 26 of Interdisciplinary applied mathematics. Springer, New York.
  12. Newcombe, R., Lovegrove, S., and Davison, A. (2011a). DTAM: Dense tracking and mapping in real-time. In ICCV, volume 1.
  13. Newcombe, R. A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A. J., Kohli, P., Shotton, J., Hodges, S., and Fitzgibbon, A. (2011b). KinectFusion: Real-time dense surface mapping and tracking. In ISMAR, pages 127-136.
  14. Rusu, R. B. and Cousins, S. (2011). 3D is here: Point Cloud Library (PCL). In IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China.
  15. Snavely, N., Seitz, S. M., and Szeliski, R. (2006). Photo tourism: Exploring photo collections in 3D. In ACM Transactions on Graphics, pages 835-846.
  16. Sturm, J., Magnenat, S., Engelhard, N., Pomerleau, F., Colas, F., Burgard, W., Cremers, D., and Siegwart, R. (2011). Towards a benchmark for RGB-D SLAM evaluation. In Proc. of the RGB-D Workshop on Advanced Reasoning with Depth Cameras at Robotics: Science and Systems Conf. (RSS), Los Angeles, USA.
  17. Triggs, B., McLauchlan, P., Hartley, R., and Fitzgibbon, A. (2000). Bundle adjustment - a modern synthesis. In Triggs, B., Zisserman, A., and Szeliski, R., editors, Vision Algorithms: Theory and Practice, volume 1883 of Lecture Notes in Computer Science, pages 298-372. Springer-Verlag.
  18. Tykkälä, T. M., Audras, C., and Comport, A. (2011). Direct iterative closest point for real-time visual odometry. In ICCV Workshop CVVT.
Paper Citation


in Harvard Style

Tykkälä T., Hartikainen H., Comport A. and Kämäräinen J. (2013). RGB-D Tracking and Reconstruction for TV Broadcasts. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2013) ISBN 978-989-8565-48-8, pages 247-252. DOI: 10.5220/0004279602470252


in Bibtex Style

@conference{visapp13,
author={Tommi Tykkälä and Hannu Hartikainen and Andrew I. Comport and Joni-Kristian Kämäräinen},
title={RGB-D Tracking and Reconstruction for TV Broadcasts},
booktitle={Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2013)},
year={2013},
pages={247-252},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004279602470252},
isbn={978-989-8565-48-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2013)
TI - RGB-D Tracking and Reconstruction for TV Broadcasts
SN - 978-989-8565-48-8
AU - Tykkälä T.
AU - Hartikainen H.
AU - Comport A.
AU - Kämäräinen J.
PY - 2013
SP - 247
EP - 252
DO - 10.5220/0004279602470252