Figure 10: Bounce position estimation result for the trajectory data of Fig. 8 (4): (a) bounce position estimation in 2-D space; (b) view from camera No. 11 in Fig. 4; (c) view from camera No. 14 in Fig. 4.
extracted candidate 2-D positions by two-view reconstruction for every image pair. By analyzing the distribution of the reconstructed 3-D points, we took the centroid of the cluster as the 3-D ball position. Moreover, we parameterized the 3-D ball trajectories by fitting two parabolas to them.
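As a minimal sketch of these two steps, the snippet below averages the inlier cluster of pairwise-reconstructed points around their median and fits one parabola per flight segment; the helper names, the inlier radius, and the per-axis polynomial degrees are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def cluster_centroid(points, inlier_radius=0.1):
    """Centroid of the cluster of pairwise-reconstructed 3-D points.

    points: (N, 3) candidate 3-D positions for one frame, one per
    camera pair. Points far from the median are treated as outliers
    from bad pairs; inlier_radius (in metres) is an assumed value.
    """
    median = np.median(points, axis=0)
    dist = np.linalg.norm(points - median, axis=1)
    inliers = points[dist < inlier_radius]
    return inliers.mean(axis=0) if len(inliers) else median

def fit_parabola_segment(times, positions):
    """Parameterize one flight segment: x, y linear in t, z quadratic.

    times: (T,) timestamps; positions: (T, 3) per-frame ball positions.
    Fitting two such segments (before and after the bounce) gives the
    two parabolas mentioned in the text.
    """
    cx = np.polyfit(times, positions[:, 0], 1)
    cy = np.polyfit(times, positions[:, 1], 1)
    cz = np.polyfit(times, positions[:, 2], 2)  # gravity -> parabolic height
    return cx, cy, cz
```

Fitting x and y linearly and z quadratically reflects the constant-gravity motion implied by a parabola parameterization.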
In our simulation and real-video experiments, we confirmed that our method estimates the 3-D ball trajectory more stably than a multi-view reconstruction method. We also confirmed that a ball trajectory can be accurately parameterized by simple parabola equations, and we estimated the bounce position from this parameterization result.
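A hedged sketch of this last step, assuming court coordinates with the ground plane at z = 0 and reusing the per-segment coefficients from the sketch above; the paper may instead intersect the two fitted parabolas:

```python
import numpy as np

def bounce_position(cx, cy, cz):
    """Bounce point from the pre-bounce segment's fitted coefficients.

    Assumes the court plane is z = 0; solves z(t) = 0 and evaluates
    x(t), y(t) at the later (landing) root.
    """
    roots = np.roots(cz)                  # a*t^2 + b*t + c = 0
    real = roots[np.isreal(roots)].real   # keep real solutions only
    t_b = real.max()                      # later ground contact = bounce
    return np.array([np.polyval(cx, t_b), np.polyval(cy, t_b), 0.0])
```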
In future work, we plan to tackle automated synchronization of the input video sequences by using the epipolar constraint, as sketched below. We also plan to measure the ball speed with a speed gun and compare it with the speed estimated from our 3-D reconstruction results.
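One plausible shape for that synchronization, sketched under the assumption that a fundamental matrix F between two views and per-frame 2-D ball tracks are already available; the function names, minimum-overlap threshold, and search range are hypothetical:

```python
import numpy as np

def point_line_error(pts1, pts2, F):
    """Mean distance from points in view 2 to epipolar lines of view 1."""
    p1h = np.column_stack([pts1, np.ones(len(pts1))])
    p2h = np.column_stack([pts2, np.ones(len(pts2))])
    lines = p1h @ F.T                          # l' = F x, lines in view 2
    num = np.abs(np.sum(lines * p2h, axis=1))  # |l'^T x'|
    den = np.linalg.norm(lines[:, :2], axis=1)
    return float(np.mean(num / den))

def estimate_frame_offset(track1, track2, F, max_shift=30):
    """Pick the integer frame offset minimizing the epipolar error.

    track1, track2: (T, 2) per-frame ball image positions in each view.
    """
    best_d, best_err = 0, np.inf
    for d in range(-max_shift, max_shift + 1):
        # align track1[t] with track2[t + d] over the valid overlap
        lo, hi = max(0, -d), min(len(track1), len(track2) - d)
        if hi - lo < 10:                       # require enough overlap
            continue
        err = point_line_error(track1[lo:hi], track2[lo + d:hi + d], F)
        if err < best_err:
            best_d, best_err = d, err
    return best_d
```

Searching only integer offsets keeps the sketch simple; sub-frame alignment would require interpolating the tracks.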