Authors:
Gustavo Rovelo 1; Francisco Abad 2; M. C. Juan 2 and Emilio Camahort 2
Affiliations:
1 Hasselt University and Universitat Politècnica de València, Belgium; 2 Universitat Politècnica de València, Spain
Keyword(s):
Multimodal Interface, User Evaluation, Mobile Augmented Reality.
Related Ontology Subjects/Areas/Topics:
Advanced User Interfaces; Augmented, Mixed and Virtual Environments; Computer Vision, Visualization and Computer Graphics; Interactive Environments
Abstract:
The widespread use of mobile devices, together with their computational capabilities, enables novel interaction techniques that improve user performance in traditional mobile applications. Navigation assistance is an important area in the mobile domain, and Google Maps is probably its most popular example. This type of application places high demands on the user's attention, especially on the visual channel. Tactile and auditory feedback have been studied as alternatives to visual feedback for navigation assistance in order to reduce this dependency. However, there is still room for improvement, and more research is needed to understand, for example, how the three feedback modalities complement each other, especially with the appearance of new technology such as smartwatches and new displays such as Google Glass. The goal of our work is to study how users perceive multimodal feedback when their route is augmented with directional cues. Our results show that tactile guidance cues produced the worst user performance, both objectively and subjectively. Participants reported that vibration patterns were hard to decode. However, tactile feedback was an unobtrusive way to inform participants when to look at the mobile screen or listen to the spoken directions. The results show that combining feedback modalities produces good user performance.