to relatively simple 3D interaction tasks and to isometric input devices. Our results have shown that task complexity influences several relevant aspects of interaction performance; for even more complex tasks (especially those involving more DoF), they might thus be less reliable. We hence recommend re-analyzing interaction performance before generalizing the findings to more than three DoF.
Further, as explained earlier, the results related to the interaction direction down are less reliable due to the limited comparability of the related input activities. We reported these results for the sake of completeness despite this limitation, but recommend relying only on those we identified as conclusive.
Another limitation is that, in our user study, the interaction directions up and back were restricted to moving back to the initial position after having moved forward or down.
Future work could include repeating the study with a different target group. We believe that the HID setting bears higher potential for people with limited hand mobility than for users without motor impairments affecting the interacting hand. Further, we believe that a similar study with devices offering more DoF would yield interesting insights, and we expect it to confirm the observed trend regarding task complexity.