we can identify functionality gaps in the current SDKs.
The upcoming survey will be significant because 360-degree videos are gaining popularity among consumers, developers are early adopters of the technology, and there are relatively few scientific publications about choosing a 360-degree video SDK.
This work is an initial study for a research project called 360 Video Intelligence. The purpose of the project is to create a 360-degree video platform that provides an easy way to run different kinds of analyses, for example object detection algorithms, on 360-degree videos. The videos with the added metadata will then be played in a 360-degree video player application. However, the player will not only play the video with the visualized metadata but also gather a user log for further analysis. Practical use cases for user logging include viewport prediction, which can be used, for example, to provide higher video resolution only within the field of view, similarly to the work presented in (Ochi et al., 2014); a rough sketch of such a log is given below. We will also need UI elements for visualizing the added metadata on 360-degree videos. With the knowledge gained from developing our criteria and from the upcoming survey, we will have a better understanding of how to develop such applications.
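As a rough illustration only, the following Python sketch shows one possible shape for the user-log entries and a naive viewport prediction based on them. The field names (timestamp, yaw, pitch) and the linear extrapolation are hypothetical assumptions for illustration, not a description of the planned platform or of any surveyed SDK.

from dataclasses import dataclass
from typing import List

@dataclass
class ViewportSample:
    """One user-log entry: where the viewer was looking at a given time.

    The fields are hypothetical; a real player SDK may expose orientation
    as a quaternion or rotation matrix instead of yaw/pitch angles.
    """
    timestamp: float  # seconds since playback start
    yaw: float        # horizontal viewing angle in degrees
    pitch: float      # vertical viewing angle in degrees

def predict_viewport(log: List[ViewportSample], horizon: float) -> ViewportSample:
    """Naive viewport prediction: linear extrapolation of the last two samples.

    This only illustrates how a gathered user log could feed a prediction
    step that selects which part of the video to stream in higher resolution.
    """
    if not log:
        raise ValueError("empty viewport log")
    if len(log) < 2:
        return log[-1]
    prev, last = log[-2], log[-1]
    dt = (last.timestamp - prev.timestamp) or 1e-6
    yaw_rate = (last.yaw - prev.yaw) / dt
    pitch_rate = (last.pitch - prev.pitch) / dt
    return ViewportSample(
        timestamp=last.timestamp + horizon,
        yaw=(last.yaw + yaw_rate * horizon) % 360.0,
        pitch=max(-90.0, min(90.0, last.pitch + pitch_rate * horizon)),
    )

# Example: predict where the viewer will look 0.5 s ahead.
log = [ViewportSample(0.0, 10.0, 0.0), ViewportSample(0.1, 14.0, 1.0)]
print(predict_viewport(log, horizon=0.5))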
ACKNOWLEDGEMENTS
This study was conducted within a research project called 360 Video Intelligence. We would like to thank TEKES for funding the project.
REFERENCES
Alface, P. R., Macq, J.-F., and Verzijp, N. (2011). Evaluation of bandwidth performance for interactive spherical video. In Multimedia and Expo (ICME), 2011 IEEE International Conference on, pages 1–6. IEEE.
Argyriou, L., Economou, D., Bouki, V., and Doumanis, I. (2016). Engaging immersive video consumers: Challenges regarding 360-degree gamified video applications. In Ubiquitous Computing and Communications and 2016 International Symposium on Cyberspace and Security (IUCC-CSS), International Conference on, pages 145–152. IEEE.
Bierbaum, A. and Just, C. (1998). Software tools for virtual reality application development. Course Notes for SIGGRAPH, 98.
Dalmasso, I., Datta, S. K., Bonnet, C., and Nikaein, N. (2013). Survey, comparison and evaluation of cross platform mobile application development tools. In Wireless Communications and Mobile Computing Conference (IWCMC), 2013 9th International, pages 323–328. IEEE.
Fagerholm, F. and Münch, J. (2012). Developer experience: Concept and definition. In Software and System Process (ICSSP), 2012 International Conference on, pages 73–77. IEEE.
LaValle, S. M., Yershova, A., Katsev, M., and Antonov, M. (2014). Head tracking for the Oculus Rift. In Robotics and Automation (ICRA), 2014 IEEE International Conference on, pages 187–194. IEEE.
Linowes, J. and Schoen, M. (2016). Cardboard VR Projects for Android. Packt Publishing Ltd.
Nykaza, J., Messinger, R., Boehme, F., Norman, C. L., Mace, M., and Gordon, M. (2002). What programmers really want: Results of a needs assessment for SDK documentation. In Proceedings of the 20th Annual International Conference on Computer Documentation, pages 133–141. ACM.
Ochi, D., Kunita, Y., Fujii, K., Kojima, A., Iwaki, S., and Hirose, J. (2014). HMD viewing spherical video streaming system. In Proceedings of the 22nd ACM International Conference on Multimedia, pages 763–764. ACM.
Palme, E., Tan, C.-H., Sutanto, J., and Phang, C. W. (2010). Choosing the smart phone operating system for developing mobile applications. In Proceedings of the 12th International Conference on Electronic Commerce: Roadmap for the Future of Electronic Business, pages 146–152. ACM.
Shibata, T. (2002). Head mounted display. Displays, 23(1):57–64.
Yucel, I. H. and Edgell, R. A. (2015). Conceptualizing factors of adoption for head mounted displays: Toward an integrated multi-perspective framework. Journal For Virtual Worlds Research, 8(2).