Studying the User Experience with a Multimodal Pedestrian Navigation Assistant

Gustavo Rovelo, Francisco Abad, M.-C. Juan, Emilio Camahort

Abstract

The widespread use of mobile devices, together with their computational capabilities, enables the implementation of novel interaction techniques that improve user performance in traditional mobile applications. Navigation assistance is an important area in the mobile domain, and Google Maps is probably its most popular example. This type of application is highly demanding of the user's attention, especially in the visual channel. Tactile and auditory feedback have been studied as alternatives to visual feedback for navigation assistance in order to reduce this dependency. However, there is still room for improvement, and more research is needed to understand, for example, how the three feedback modalities complement each other, especially with the appearance of new technology such as smartwatches and new displays such as Google Glass. The goal of our work is to study how users perceive multimodal feedback when their route is augmented with directional cues. Our results show that tactile guidance cues produced the worst user performance, both objectively and subjectively. Participants reported that vibration patterns were hard to decode. However, tactile feedback was an unobtrusive technique to inform participants when to look at the mobile screen or listen to the spoken directions. The results show that combining feedback modalities produces good user performance.

References

  1. Chittaro, L. and Burigat, S. (2005). Augmenting Audio Messages with Visual Directions in Mobile Guides: An Evaluation of Three Approaches. In Proc. of the 7th Int. Conf. on Human Computer Interaction with Mobile Devices and Services, MobileHCI, pages 107-114.
  2. Fröhlich, P., Oulasvirta, A., Baldauf, M., and Nurminen, A. (2011). On the Move, Wirelessly Connected to the World. Com. of the ACM, 54(1):132-138.
  3. Jacob, R., Mooney, P., Corcoran, P., and Winstanley, A. C. (2011a). Guided by Touch: Tactile Pedestrian Navigation. In Proc. of the GIS Research UK 19th Annual Conf., GISRUK, pages 205-215.
  4. Jacob, R., Mooney, P., and Winstanley, A. C. (2011b). Guided by Touch: Tactile Pedestrian Navigation. In Proc. of the 1st Int. Workshop on Mobile Location-based Service, MLBS, pages 11-20.
  5. Jameson, A. (2002). Usability Issues and Methods for Mobile Multimodal Systems. In Proc. of the ISCA Tutorial and Research Workshop on Multi-Modal Dialogue in Mobile Environments.
  6. Liljedahl, M., Lindberg, S., Delsing, K., Polojärvi, M., Saloranta, T., and Alakärppä, I. (2012). Testing Two Tools for Multimodal Navigation. Adv. in Human-Computer Interaction, 2012.
  7. Liljedahl, M. and Papworth, N. (2012). Using Sound to Enhance Users' Experiences of Mobile Applications. In Proc. of the 7th Audio Mostly Conf.: A Conference on Interaction with Sound, AM, pages 24-31.
  8. Magnusson, C., Molina, M., Rassmus-Gröhn, K., and Szymczak, D. (2010). Pointing for Non-visual Orientation and Navigation. In Proc. of the 6th Nordic Conf. on Human-Computer Interaction: Extending Boundaries, NordiCHI, pages 735-738.
  9. Oulasvirta, A., Tamminen, S., Roto, V., and Kuorelahti, J. (2005). Interaction in 4-Second Bursts: The Fragmented Nature of Attentional Resources in Mobile HCI. In Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, CHI, pages 919-928.
  10. Pielot, M. and Boll, S. (2010). Tactile Wayfinder: Comparison of Tactile Waypoint Navigation with Commercial Pedestrian Navigation Systems. In Proc. of the Int. Conf. on Pervasive Computing, Pervasive, pages 76-93.
  11. Pielot, M., Henze, N., and Boll, S. (2009). Supporting Map-based Wayfinding with Tactile Cues. In Proc. of the 11th Int. Conf. on Human-Computer Interaction with Mobile Devices and Services, MobileHCI, pages 23:1-23:10.
  12. Pielot, M., Heuten, W., Zerhusen, S., and Boll, S. (2012a). Dude, Where's My Car?: In-situ Evaluation of a Tactile Car Finder. In Proc. of the 7th Nordic Conf. on Human-Computer Interaction: Making Sense Through Design, NordiCHI, pages 166-169.
  13. Pielot, M., Poppinga, B., and Boll, S. (2010). PocketNavigator: Vibro-tactile Waypoint Navigation for Everyday Mobile Devices. In Proc. of the Conf. on Human-Computer Interaction with Mobile Devices and Services, MobileHCI, pages 423-426.
  14. Pielot, M., Poppinga, B., Heuten, W., and Boll, S. (2012b). PocketNavigator: Studying Tactile Navigation Systems In-situ. In Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, CHI, pages 3131-3140.
  15. Raisamo, R., Nukarinen, T., Pystynen, J., Mäkinen, E., and Kildal, J. (2012). Orientation Inquiry: A New Haptic Interaction Technique for Non-visual Pedestrian Navigation. In Proc. of the 2012 Int. Conf. on Haptics: Perception, Devices, Mobility, and Communication - Volume Part II, EuroHaptics, pages 139-144.
  16. Robinson, S., Jones, M., Eslambolchilar, P., Murray-Smith, R., and Lindborg, M. (2010). "I Did It My Way": Moving Away from the Tyranny of Turn-by-turn Pedestrian Navigation. In Proc. of the 12th Int. Conf. on Human Computer Interaction with Mobile Devices and Services, MobileHCI, pages 341-344.
  17. Rümelin, S., Rukzio, E., and Hardy, R. (2011). NaviRadar: A Novel Tactile Information Display for Pedestrian Navigation. In Proc. of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST, pages 293-302.
  18. Strachan, S., Eslambolchilar, P., Murray-Smith, R., Hughes, S., and O'Modhrain, S. (2005). GpsTunes: Controlling Navigation via Audio Feedback. In Proc. of the 7th Int. Conf. on Human Computer Interaction with Mobile Devices and Services, MobileHCI, pages 275-278.
  19. Szymczak, D., Magnusson, C., and Rassmus-Gröhn, K. (2012). Guiding Tourists Through Haptic Interaction: Vibration Feedback in the Lund Time Machine. In Proc. of the 2012 Int. Conf. on Haptics: Perception, Devices, Mobility, and Communication - Volume Part II, EuroHaptics, pages 157-162.
  20. Vainio, T. (2009). Exploring Multimodal Navigation Aids for Mobile Users. In Proc. of the 12th IFIP TC 13 Int. Conf. on Human-Computer Interaction: Part I, INTERACT, pages 853-865.

Paper Citation


in Harvard Style

Rovelo G., Abad F., Juan M. and Camahort E. (2015). Studying the User Experience with a Multimodal Pedestrian Navigation Assistant. In Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015) ISBN 978-989-758-087-1, pages 438-445. DOI: 10.5220/0005297504380445


in Bibtex Style

@conference{grapp15,
author={Gustavo Rovelo and Francisco Abad and M.-C. Juan and Emilio Camahort},
title={Studying the User Experience with a Multimodal Pedestrian Navigation Assistant},
booktitle={Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015)},
year={2015},
pages={438-445},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005297504380445},
isbn={978-989-758-087-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2015)
TI - Studying the User Experience with a Multimodal Pedestrian Navigation Assistant
SN - 978-989-758-087-1
AU - Rovelo G.
AU - Abad F.
AU - Juan M.
AU - Camahort E.
PY - 2015
SP - 438
EP - 445
DO - 10.5220/0005297504380445