6 CONCLUSION AND FUTURE WORK
This paper presented an approach to a more universal definition of gesture interactions and gesture types. Its purpose is to facilitate communication within the development process by establishing a common understanding of gesture terms. Indeed, in our current work on gesture-based systems, misunderstandings we had previously encountered could be avoided with the help of the terms and notation proposed in this paper. In addition, in our university lectures, students benefited from the resulting common understanding of the most important aspects of designing gesture interaction. All in all, compared to the various taxonomies in the literature, our approach appears to us to be more practical.
In contrast to existing taxonomies, our definitions distinguish between UI reactions (feedback) and system reactions (functionality). A further central extension to existing gesture taxonomies is the use of temporal intervals for the execution of gestures (body movements), UI feedback, and system functionality, together with the relations between these intervals.
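To illustrate the idea, the following is a minimal sketch of how such intervals and their relations could be represented. It assumes intervals are simple (start, end) pairs; the names `Interval`, `during`, `overlaps`, and `after` are illustrative choices corresponding to well-known Allen-style interval relations, not identifiers from our notation.

```python
# Hypothetical sketch: gesture execution, UI feedback, and system
# functionality modeled as temporal intervals, with Allen-style
# relations checked between them. Names are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Interval:
    start: float
    end: float


def during(a: Interval, b: Interval) -> bool:
    """a lies strictly inside b (Allen's 'during')."""
    return b.start < a.start and a.end < b.end


def overlaps(a: Interval, b: Interval) -> bool:
    """a starts first and the two intervals overlap (Allen's 'overlaps')."""
    return a.start < b.start < a.end < b.end


def after(a: Interval, b: Interval) -> bool:
    """a begins only once b has ended (Allen's 'after')."""
    return a.start > b.end


# Example: continuous feedback is shown while the gesture is performed;
# the system functionality is triggered once the gesture has ended.
gesture = Interval(0.0, 2.0)
feedback = Interval(0.2, 1.8)
function = Interval(2.1, 2.5)

print(during(feedback, gesture))   # feedback occurs during the gesture
print(after(function, gesture))    # functionality follows gesture completion
```

Checking such relations makes the distinction between feedback and functionality explicit: the same gesture can be paired with feedback *during* its execution but functionality only *after* it.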
The previous section introduced and outlined some extensions that will be investigated in more detail in future work. In particular, the criteria presented above are to be applied in further projects and evaluated in additional studies, and it should be examined whether further dependencies between individual criteria can be found. The extensions presented in this paper, such as the further segmentation of gesture execution and the use of delays, seem particularly interesting. Furthermore, additional criteria (e.g., gesture styles) should be considered as another extension of our approach.
ACKNOWLEDGEMENTS
This research was financially supported by the
German Federal Ministry of Education and Research
within the program “Forschung an Fachhochschulen
– IngenieurNachwuchs” (project no. 03FH007IX5).
Definition of Gesture Interactions based on Temporal Relations