task than participants using the multimodal interaction paradigm. This can be interpreted as a reduction of the cognitive load on the user when commands are integrated across multiple modalities. Another significant difference was found
in terms of robustness: participants were asked to rate how stably the interaction paradigm worked for them. The higher robustness ratings for the multimodal paradigm indicate that, because it reacts to an interaction only when multiple modalities are used at once, the error rate is reduced, yielding a more robust solution than the unimodal interaction paradigm, in which a single movement could already be falsely recognized as an intended gesture.
Although the user study was performed with a fairly limited number of participants (10 in total), Nielsen (2000) argues that a user study with 5 participants fully suffices in most cases. In addition, the statistical analysis used in the present work, Student's t-test, takes the number of participants into account (Haynes, 2013).
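To illustrate how the sample size enters the analysis, the following is a minimal sketch of the two-sample Student's t-test using only the Python standard library. The completion times below are hypothetical placeholders, not data from the study; only the structure of the computation (pooled variance over n1 + n2 - 2 degrees of freedom) reflects the test referenced above.

```python
import math
from statistics import mean, variance

def students_t(a, b):
    """Two-sample Student's t statistic for independent groups.

    Uses the pooled sample variance; both the standard error and the
    degrees of freedom depend directly on the group sizes, which is
    how the test accounts for the number of participants.
    """
    na, nb = len(a), len(b)
    df = na + nb - 2
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / df
    t = (mean(a) - mean(b)) / math.sqrt(pooled_var * (1 / na + 1 / nb))
    return t, df

# Hypothetical task-completion times (seconds) for two groups of 5 users.
unimodal   = [42.0, 39.5, 47.1, 44.8, 41.2]
multimodal = [33.4, 30.9, 35.2, 31.7, 34.0]

t, df = students_t(unimodal, multimodal)
print(f"t = {t:.2f} on {df} degrees of freedom")
```

With only 5 participants per group, the statistic has 8 degrees of freedom, so a correspondingly large t value is needed before the difference between paradigms counts as significant.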
Since even the most complex task in the present work, the creation of objects with the use of system control, is relatively simple, studies with more complex tasks are required to further confirm the results of the present work. Yet the results already show that there is a highly significant difference between the two paradigms and that multimodal interaction techniques can improve the efficiency of a natural user interface.
The results gained in this work relate to an interactive scientific data visualization application. Although the interaction tasks the users had to fulfill in the user study are in principle transferable to applications outside scientific data visualization and to other scientific fields, it remains an open question whether these results also apply to other applications of human-computer interaction with different input and output modalities. Hence, it would be interesting to use different modalities, or to add further modalities such as eye gaze or haptics. Repeating the user study with a larger number of participants, with different interaction techniques, or with different or more complex interaction tasks is necessary to gain a deeper understanding of the general benefits of multimodal interaction techniques and to verify the outcome of this work.
REFERENCES
Arabzadeh, E., Clifford, C. W., and Harris, J. A. (2008). Vi-
sion Merges With Touch in a Purely Tactile Discrimi-
nation. Psychological Science, 19(7).
Bolt, R. (1980). Put–That–There. In Proceedings of the 7th
annual conference on Computer graphics and inter-
active techniques, pages 262–270.
Bowman, D., Kruijff, E., Laviola, J., and Poupyrev, I.
(2001). An Introduction to 3–D User Interface De-
sign. Presence, 10(1).
Bryson, S. (1996). Virtual Reality in Scientific Visualization. Communications of the ACM, 39(5):62–71.
Cohen, P., Johnston, M., McGee, D., Oviatt, S., Pittman, J.,
Smith, I., Chen, L., and Clow, J. (1997). QuickSet:
Multimodal Interaction for Distributed Applications.
In Proceedings of the fifth ACM international confer-
ence on Multimedia, pages 31–40.
Haynes, W. (2013). Student’s t-Test. In Encyclopedia
of Systems Biology, pages 2023–2025. Springer New
York.
Jégo, J., Paljic, A., and Fuchs, P. (2013). User-Defined Gestural Interaction: A Study on Gesture Memorization. In IEEE Symposium on 3D User Interfaces (3DUI), pages 7–10.
Nielsen, J. (2000). Why You Only Need to Test with
5 Users. http://www.nngroup.com/articles/why-you-
only-need-to-test-with-5-users/. [Online; accessed
last on 16-12-2014].
Nigay, L. and Coutaz, J. (1993). A Design Space For Multi-
modal Systems: Concurrent Processing and Data Fu-
sion. In Conference on Human Factors in Computing
Systems, pages 172–178.
Oviatt, S. (1999). Ten Myths of Multimodal Interaction.
Commun. ACM, 42(11):74–81.
Oviatt, S., Lunsford, R., and Coulston, R. (2005). Indi-
vidual differences in multimodal integration patterns:
What are they and why do they exist? In Proceed-
ings of the SIGCHI conference on Human factors in
computing systems, pages 241–249. ACM.
Turk, M. (2013). Multimodal Interaction: A Review. Pattern Recognition Letters, 36:189–195.
van Dam, A. (1997). Post–WIMP User Interfaces. Commu-
nications of the ACM, 40(2).
Xiao, B., Girand, C., and Oviatt, S. L. (2002). Multimodal
integration patterns in children. In 8th International
Conference on Spoken Language Processing, Korea.
Xiao, B., Lunsford, R., Coulston, R., Wesson, M., and Ovi-
att, S. (2003). Modeling multimodal integration pat-
terns and performance in seniors: Toward adaptive
processing of individual differences. In Proceedings
of the 5th international conference on Multimodal in-
terfaces, pages 265–272. ACM.
Multimodal Interaction Techniques in Scientific Data Visualization - An Analytical Survey