Towards Interactive Multisensory Data Representations

Susanne Tak, Lex Toet

2013

Abstract

Despite the availability of a broad range of newly developed multisensory displays and interaction techniques, multisensory interactive data representations are still not widely used. We argue that multisensory data representations and multimodal interactivity are essential for complex information analysis. By leveraging the strengths of the individual sensory modalities, multisensory representations and interaction techniques can make the representation and handling of complex data more intuitive and transparent. This can make complex data analysis accessible to a wider audience, including non-experts. However, there are currently no agreed guidelines for their integrated design, and little empirical research in this area. We argue that there is an urgent need for systematic research into human multisensory information processing, to provide rules for designing and constructing representations and interfaces that achieve optimal synergistic cooperation across the sensory modalities.



Paper Citation


in Harvard Style

Tak, S. and Toet, L. (2013). Towards Interactive Multisensory Data Representations. In Proceedings of the International Conference on Computer Graphics Theory and Applications and International Conference on Information Visualization Theory and Applications - Volume 1: IVAPP (VISIGRAPP 2013), ISBN 978-989-8565-46-4, pages 558-561. DOI: 10.5220/0004346405580561


in Bibtex Style

@conference{ivapp13,
author={Susanne Tak and Lex Toet},
title={Towards Interactive Multisensory Data Representations},
booktitle={Proceedings of the International Conference on Computer Graphics Theory and Applications and International Conference on Information Visualization Theory and Applications - Volume 1: IVAPP, (VISIGRAPP 2013)},
year={2013},
pages={558-561},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004346405580561},
isbn={978-989-8565-46-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Graphics Theory and Applications and International Conference on Information Visualization Theory and Applications - Volume 1: IVAPP, (VISIGRAPP 2013)
TI - Towards Interactive Multisensory Data Representations
SN - 978-989-8565-46-4
AU - Tak S.
AU - Toet L.
PY - 2013
SP - 558
EP - 561
DO - 10.5220/0004346405580561