Optical Flow Refinement using Reliable Flow Propagation

Tan Khoa Mai, Michèle Gouiffes, Samia Bouchafa

2017

Abstract

This paper shows how to improve optical flow estimation by combining a neighborhood consensus strategy with a reliable flow propagation method. Propagation takes advantage of reliability measures that are available from local low-level image features. In this paper we focus on color, but the method could easily be generalized by also considering texture or gradient features. We investigate the conditions for estimating accurate optical flow and correctly handling flow discontinuities by proposing a variant of the well-known Kanade-Lucas-Tomasi (KLT) approach. Starting from this classical approach, a consensual flow is estimated locally, while two additional criteria are proposed to evaluate its reliability. Reliable flow is then propagated throughout the image using a specific distance criterion based on color and proximity. Experiments conducted on the Middlebury database show better results than the classic KLT and even than global methods such as the well-known Horn and Schunck or Black and Anandan approaches.
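
As a rough illustration only (a minimal sketch, not the authors' implementation), the Python code below mirrors the pipeline outlined in the abstract: a pyramidal Lucas-Kanade step on a regular grid of points using OpenCV's cv2.calcOpticalFlowPyrLK, a simple forward-backward consistency check standing in for the paper's two reliability criteria, and a bilateral-style interpolation that propagates the reliable vectors according to color similarity and spatial proximity. The function names, grid step, and thresholds are illustrative assumptions, not values from the paper.

# Minimal sketch (not the paper's implementation) of KLT-based flow with
# reliability filtering and color/proximity-weighted propagation.
# Assumes BGR color input images, OpenCV and NumPy.
import cv2
import numpy as np

def sparse_klt_flow(img0, img1, step=8, fb_thresh=0.5):
    """Track a regular grid of points with pyramidal Lucas-Kanade and keep
    only the vectors that pass a forward-backward consistency check
    (an illustrative stand-in for the paper's reliability criteria)."""
    gray0 = cv2.cvtColor(img0, cv2.COLOR_BGR2GRAY)
    gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    h, w = gray0.shape
    ys, xs = np.mgrid[step // 2:h:step, step // 2:w:step]
    pts0 = np.stack([xs.ravel(), ys.ravel()], axis=1)
    pts0 = pts0.astype(np.float32).reshape(-1, 1, 2)
    # Forward tracking (img0 -> img1), then backward tracking (img1 -> img0).
    pts1, st, _ = cv2.calcOpticalFlowPyrLK(gray0, gray1, pts0, None,
                                           winSize=(21, 21), maxLevel=3)
    back, st_b, _ = cv2.calcOpticalFlowPyrLK(gray1, gray0, pts1, None,
                                             winSize=(21, 21), maxLevel=3)
    fb_err = np.linalg.norm((pts0 - back).reshape(-1, 2), axis=1)
    reliable = (st.ravel() == 1) & (st_b.ravel() == 1) & (fb_err < fb_thresh)
    seeds = pts0.reshape(-1, 2)[reliable]
    seed_flow = (pts1 - pts0).reshape(-1, 2)[reliable]
    return seeds, seed_flow

def propagate_flow(img0, seeds, seed_flow, sigma_color=15.0, sigma_space=20.0):
    """Spread the reliable seed vectors to every pixel, weighting each seed
    by color similarity and spatial proximity (bilateral-style averaging)."""
    h, w = img0.shape[:2]
    img = img0.astype(np.float32)
    seed_colors = img[seeds[:, 1].astype(int), seeds[:, 0].astype(int)]
    dense = np.zeros((h, w, 2), np.float32)
    for y in range(h):
        for x in range(w):
            d_space = np.linalg.norm(seeds - np.array([x, y], np.float32), axis=1)
            d_color = np.linalg.norm(seed_colors - img[y, x], axis=1)
            wgt = np.exp(-d_space**2 / (2 * sigma_space**2)
                         - d_color**2 / (2 * sigma_color**2))
            dense[y, x] = (wgt[:, None] * seed_flow).sum(0) / (wgt.sum() + 1e-8)
    return dense

Under these assumptions, a dense field is obtained with propagate_flow(img0, *sparse_klt_flow(img0, img1)); the paper's actual local consensus and propagation rules differ in their details.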

References

  1. Bab-Hadiashar, A. and Suter, D. (1998). Robust optic flow computation. International Journal of Computer Vision, 29(1):59-77.
  2. Baker, S., Scharstein, D., Lewis, J. P., Roth, S., Black, M. J., and Szeliski, R. (2010). A Database and Evaluation Methodology for Optical Flow. International Journal of Computer Vision, 92(1):1-31.
  3. Black, M. J. and Anandan, P. (1991). Robust dynamic motion estimation over time. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '91), pages 296-302.
  4. Black, M. J. and Anandan, P. (1996). The Robust Estimation of Multiple Motions: Parametric and Piecewise-Smooth Flow Fields. Computer Vision and Image Understanding, 63(1):75-104.
  5. Brox, T. and Malik, J. (2011). Large displacement optical flow: Descriptor matching in variational motion estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(3):500-513.
  6. Brox, T. and Weickert, J. (2002). Nonlinear Matrix Diffusion for Optic Flow Estimation. In Gool, L. V., editor, Pattern Recognition, number 2449 in Lecture Notes in Computer Science, pages 446-453. Springer Berlin Heidelberg. DOI: 10.1007/3-540-45783-6_54.
  7. Chen, Z., Jin, H., Lin, Z., Cohen, S., and Wu, Y. (2013). Large Displacement Optical Flow from Nearest Neighbor Fields. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 2443-2450.
  8. Farneback, G. (2000). Fast and accurate motion estimation using orientation tensors and parametric motion models. In International Conference on Pattern Recognition (ICPR), volume 1, pages 135-139.
  9. Horn, B. K. and Schunck, B. G. (1981). Determining optical flow. Artificial Intelligence, 17(1-3):185-203.
  10. Kim, T. H., Lee, H. S., and Lee, K. M. (2013). Optical Flow via Locally Adaptive Fusion of Complementary Data Costs. In IEEE International Conference on Computer Vision (ICCV), pages 3344-3351.
  11. Liu, H., Chellappa, R., and Rosenfeld, A. (2003). Accurate dense optical flow estimation using adaptive structure tensors and a parametric model. IEEE Transactions on Image Processing, 12(10):1170-1180.
  12. Lucas, B. D. and Kanade, T. (1981). An iterative image registration technique with an application to stereo vision. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), pages 674-679.
  13. Middendorf, M. and Nagel, H. H. (2001). Estimation and interpretation of discontinuities in optical flow fields. In Eighth IEEE International Conference on Computer Vision, 2001. ICCV 2001. Proceedings, volume 1, pages 178-183 vol.1.
  14. Nagel, H.-H. and Gehrke, A. (1998). Spatiotemporally adaptive estimation and segmentation of OF-fields. In Burkhardt, H. and Neumann, B., editors, Computer Vision - ECCV'98, number 1407 in Lecture Notes in Computer Science, pages 86-102. Springer Berlin Heidelberg. DOI: 10.1007/BFb0054735.
  15. Sun, D., Roth, S., and Black, M. J. (2010). Secrets of optical flow estimation and their principles. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 2432-2439.
  16. Weinzaepfel, P., Revaud, J., Harchaoui, Z., and Schmid, C. (2013). DeepFlow: Large displacement optical flow with deep matching. In IEEE International Conference on Computer Vision (ICCV), Sydney, Australia.
  17. Xu, L., Jia, J., and Matsushita, Y. (2010). Motion detail preserving optical flow estimation. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 1293-1300.
  18. Yang, J. and Li, H. (2015). Dense, accurate optical flow estimation with piecewise parametric model. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 1019-1027.
  19. Bouguet, J.-Y. (2000). Pyramidal implementation of the Lucas-Kanade feature tracker. Intel Corporation, Microprocessor Research Labs.


Paper Citation


in Harvard Style

Mai T., Gouiffes M. and Bouchafa S. (2017). Optical Flow Refinement using Reliable Flow Propagation. In Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP, (VISIGRAPP 2017) ISBN 978-989-758-227-1, pages 451-458. DOI: 10.5220/0006131704510458


in Bibtex Style

@conference{visapp17,
author={Tan Khoa Mai and Michèle Gouiffes and Samia Bouchafa},
title={Optical Flow Refinement using Reliable Flow Propagation},
booktitle={Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP, (VISIGRAPP 2017)},
year={2017},
pages={451-458},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006131704510458},
isbn={978-989-758-227-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 6: VISAPP, (VISIGRAPP 2017)
TI - Optical Flow Refinement using Reliable Flow Propagation
SN - 978-989-758-227-1
AU - Mai T.
AU - Gouiffes M.
AU - Bouchafa S.
PY - 2017
SP - 451
EP - 458
DO - 10.5220/0006131704510458