to adapt reduce the naturalness of the interaction
and the sense of presence. Moreover, we wanted to
evaluate whether our grasp function allows natural
and efficient interaction with virtual objects. Many
participants also reported that seeing a virtual
representation of their hands, and using it to interact
with the surrounding environment, made them feel more
confident, as it improved their ability to estimate
distances. Nevertheless, further investigation of this
topic is required. One of the main issues participants
pointed out was the absence of physics: it would
therefore be interesting to investigate how to add
gravity and collisions between objects without losing
stability. Finally, we would like to refine the
parameter controlling the grasp action and adapt it to
the shapes of different objects. This could also help
solve the hand-object interpenetration problem while
maintaining good performance.
Studying Natural Human-computer Interaction in Immersive Virtual Reality: A Comparison between Actions in the Peripersonal and in the Near-action Space