Authors: Takuro Oki 1; Ryusuke Miyamoto 2; Hiroyuki Yomo 3 and Shinsuke Hara 4
Affiliations:
1 Department of Computer Science, Graduate School of Science and Technology, Japan
2 Department of Computer Science, School of Science and Technology, Meiji University, 1-1-1 Higashimita, Tama-ku, Kawasaki-shi, Japan
3 Department of Electrical and Electronic Engineering, Faculty of Engineering Science, Kansai University, 3-3-35 Yamate-cho, Suita-shi, Japan
4 Graduate School of Engineering, Osaka City University, 3-3-138 Sugimoto, Sumiyoshi-ku, Osaka-shi, Japan
Keyword(s):
Player Detection, Aerial Images, Informed-Filters.
Related Ontology Subjects/Areas/Topics: Computer Systems in Sports; Multimedia and Information Technology; Sport Science Research and Technology
Abstract:
To realize real-time vital sensing during exercise using wearable sensors attached to players, a novel multi-hop routing scheme is required. To solve this problem, image-assisted routing, which estimates the locations of sensor nodes from images captured by cameras on UAVs, has been proposed. However, it is not clear which viewpoints are best for player detection in aerial images. In this paper, the authors investigate detection accuracy across several viewpoints, using aerial images with annotations generated from a CG-based dataset. Experimental results show that detection accuracy is best when the viewpoint is slightly offset from directly above the center of the field; in the best case, the detector achieves a miss rate of 0.005524 at 0.01 FPPI.