Immersive Sonification for Displaying Brain Scan Data

Agnieszka Roginska, Hariharan Mohanraj, Mark Ballora, Kent Friedman

Abstract

Brain scans produce data that can be challenging to display due to their complexity, multi-dimensionality, and range. Visual representations of such data are limited by the nature of the display, the number of dimensions that can be represented visually, and the capacity of our visual system to perceive and interpret visual data. This paper describes the use of sonification to interpret brain scans, with sound serving as a complementary tool for viewing, analysis, and diagnosis. The sonification tool SoniScan is described and evaluated as a method to augment visual brain data display.



Paper Citation


in Harvard Style

Roginska A., Mohanraj H., Ballora M. and Friedman K. (2013). Immersive Sonification for Displaying Brain Scan Data. In Proceedings of the International Conference on Health Informatics - Volume 1: HEALTHINF, (BIOSTEC 2013) ISBN 978-989-8565-37-2, pages 24-33. DOI: 10.5220/0004202900240033


in Bibtex Style

@conference{healthinf13,
author={Agnieszka Roginska and Hariharan Mohanraj and Mark Ballora and Kent Friedman},
title={Immersive Sonification for Displaying Brain Scan Data},
booktitle={Proceedings of the International Conference on Health Informatics - Volume 1: HEALTHINF, (BIOSTEC 2013)},
year={2013},
pages={24-33},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004202900240033},
isbn={978-989-8565-37-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Health Informatics - Volume 1: HEALTHINF, (BIOSTEC 2013)
TI - Immersive Sonification for Displaying Brain Scan Data
SN - 978-989-8565-37-2
AU - Roginska A.
AU - Mohanraj H.
AU - Ballora M.
AU - Friedman K.
PY - 2013
SP - 24
EP - 33
DO - 10.5220/0004202900240033