Figure 11: Evaluation of our method on different trajectories within a 6 × 6 m grid. The patterns were chosen to fully cover the tracking area in different directions, and different types of movement were used during the evaluation, such as walking, jumping, and sliding.
Interactive Media Systems group at TU Wien for all the necessary hardware and laboratory space during the project. Our project was also partly supported by Comenius University grant No. UK/293/2017.