MULTIMODAL INTERACTION WITH MOBILE DEVICES - Outline of a Semiotic Framework for Theory and Practice

Gustav Öquist

2006

Abstract

This paper explores how interfaces that fully use our ability to communicate through the visual, auditory, and tactile senses may enhance mobile interaction. The first step is to look beyond the desktop. We do not need to reinvent computing, but we do need to recognize that mobile interaction does not benefit from desktop metaphors alone. The next step is to look at what we have at hand, and as we will see, mobile devices are already well suited to multimodal interaction. The question is how we can coordinate information communicated through several senses in a way that enhances interaction. By mapping information over communication circuit, semiotic representation, and the sense applied for interaction, a framework for multimodal interaction is outlined that can offer some guidance for integration. By exemplifying how a wide range of research prototypes fit into the framework today, it is shown how interfaces communicating through several modalities may enhance mobile interaction tomorrow.
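
To make the framework's mapping concrete, here is a minimal sketch in Python of how a piece of information could be classified along the three dimensions the abstract names. The dimension names come from the abstract; the member values (input/output roles in the communication circuit, Peircean sign types for the semiotic representation) are illustrative assumptions, not the paper's actual taxonomy.

from dataclasses import dataclass
from enum import Enum

# The three dimensions named in the abstract. The concrete members
# are assumptions made for illustration only.

class Circuit(Enum):
    # Direction within the communication circuit.
    INPUT = "user to device"
    OUTPUT = "device to user"

class Representation(Enum):
    # Semiotic type of the sign carrying the information
    # (Peircean trichotomy, assumed here).
    ICON = "resembles its object"
    INDEX = "points to its object"
    SYMBOL = "related by convention"

class Sense(Enum):
    # Sense (modality) applied for interaction.
    VISUAL = "sight"
    AUDITORY = "hearing"
    TACTILE = "touch"

@dataclass(frozen=True)
class Mapping:
    """One cell of the framework: a piece of information
    classified along all three dimensions."""
    information: str
    circuit: Circuit
    representation: Representation
    sense: Sense

# Example: a device announcing a new message with an auditory icon.
incoming = Mapping("new message", Circuit.OUTPUT,
                   Representation.ICON, Sense.AUDITORY)
print(incoming)

Classifying each prototype's interaction techniques as such triples is one way to see where the framework's cells are crowded and where they remain unexplored.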

Paper Citation


in Harvard Style

Öquist G. (2006). MULTIMODAL INTERACTION WITH MOBILE DEVICES - Outline of a Semiotic Framework for Theory and Practice. In Proceedings of the International Conference on Wireless Information Networks and Systems - Volume 1: WINSYS, (ICETE 2006) ISBN 978-972-8865-65-8, pages 276-283. DOI: 10.5220/0002086702760283


in Bibtex Style

@conference{winsys06,
author={Gustav Öquist},
title={MULTIMODAL INTERACTION WITH MOBILE DEVICES - Outline of a Semiotic Framework for Theory and Practice},
booktitle={Proceedings of the International Conference on Wireless Information Networks and Systems - Volume 1: WINSYS, (ICETE 2006)},
year={2006},
pages={276-283},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002086702760283},
isbn={978-972-8865-65-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Wireless Information Networks and Systems - Volume 1: WINSYS, (ICETE 2006)
TI - MULTIMODAL INTERACTION WITH MOBILE DEVICES - Outline of a Semiotic Framework for Theory and Practice
SN - 978-972-8865-65-8
AU - Öquist G.
PY - 2006
SP - 276
EP - 283
DO - 10.5220/0002086702760283
ER -