MUVTIME: A Multivariate Time Series Visualizer for Behavioral Science

Emanuel Sousa, Tiago Malheiro, Estela Bicho, Wolfram Erlhagen, Jorge Santos, Alfredo Pereira

2016

Abstract

As behavioral science becomes progressively more data driven, the need for appropriate tools for visual exploration and analysis of large datasets, often formed by multivariate time series, is growing. This paper describes MUVTIME, a multimodal time series visualization tool developed in Matlab that allows a user to load a time series collection (a multivariate time series dataset) together with an associated video. The user can plot several time series in MUVTIME and use one of them to brush the displayed data, i.e. dynamically select a time range and have the display updated accordingly. The tool also features a categorical visualization of two binary time series that works as a high-level descriptor of the coordination between two interacting partners. The paper reports the successful use of MUVTIME within the scope of project TURNTAKE, which aimed to contribute to the improvement of human-robot interaction systems by studying turn-taking dynamics (role interchange) in parent-child dyads during joint action.
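For readers unfamiliar with brushing, the following plain-Matlab sketch illustrates the idea on synthetic data. It is not MUVTIME code: the example channels and the selected range [t0, t1] are assumptions made only to show how a brushed time range can be highlighted across several linked series.

    % Minimal brushing sketch (illustrative, not the MUVTIME API):
    % the user selects a time range and every displayed channel is
    % redrawn with that range highlighted.
    t = (0:0.01:60)';                         % common time base, in seconds
    X = [sin(t), cos(2*t), 0.5*sin(3*t)];     % three example channels

    t0 = 20; t1 = 30;                         % brushed time range (user selection)
    sel = t >= t0 & t <= t1;                  % logical index of brushed samples

    figure;
    nChannels = size(X, 2);
    for k = 1:nChannels
        subplot(nChannels, 1, k);
        plot(t, X(:, k), 'Color', [0.75 0.75 0.75]); hold on;  % full series, grey
        plot(t(sel), X(sel, k), 'b', 'LineWidth', 1.5);        % brushed range, blue
        line([t0 t0], ylim, 'LineStyle', '--', 'Color', 'k');  % range start marker
        line([t1 t1], ylim, 'LineStyle', '--', 'Color', 'k');  % range end marker
        ylabel(sprintf('channel %d', k));
    end
    xlabel('time (s)');

In MUVTIME itself the range selection is driven interactively from one of the plotted series rather than hard-coded, but the linked-update pattern is the same.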



Paper Citation


in Harvard Style

Sousa E., Malheiro T., Bicho E., Erlhagen W., Santos J. and Pereira A. (2016). MUVTIME: A Multivariate Time Series Visualizer for Behavioral Science. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: IVAPP, (VISIGRAPP 2016), ISBN 978-989-758-175-5, pages 165-176. DOI: 10.5220/0005725301650176


in BibTeX Style

@conference{ivapp16,
author={Emanuel Sousa and Tiago Malheiro and Estela Bicho and Wolfram Erlhagen and Jorge Santos and Alfredo Pereira},
title={MUVTIME: A Multivariate Time Series Visualizer for Behavioral Science},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: IVAPP, (VISIGRAPP 2016)},
year={2016},
pages={165-176},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005725301650176},
isbn={978-989-758-175-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: IVAPP, (VISIGRAPP 2016)
TI - MUVTIME: A Multivariate Time Series Visualizer for Behavioral Science
SN - 978-989-758-175-5
AU - Sousa E.
AU - Malheiro T.
AU - Bicho E.
AU - Erlhagen W.
AU - Santos J.
AU - Pereira A.
PY - 2016
SP - 165
EP - 176
DO - 10.5220/0005725301650176