Feature Extraction for Human Motion Indexing of Acted Dance Performances

Andreas Aristidou, Yiorgos Chrysanthou

2014

Abstract

There has been an increasing use of pre-recorded motion capture data for animating virtual characters and synthesising different actions; it is therefore necessary to establish an effective method for indexing, classifying and retrieving motion. In this paper, we propose a method that can automatically extract motion qualities from dance performances, in terms of Laban Movement Analysis (LMA), for motion analysis and indexing purposes. The main objective of this study is to analyse the motion information of different dance performances, using the LMA components, and to extract those features that are indicative of certain emotions or actions. LMA encodes motion using four components, Body, Effort, Shape and Space, which represent a wide array of structural, geometric, and dynamic features of human motion. A deeper analysis of how these features change across different movements is presented, investigating the correlations between the performer's acted emotional state and the motion's characteristics, thus indicating the importance and the effect of each feature for the classification of the motion. Understanding the quality of the movement helps to apprehend the intentions of the performer, providing a representative search space for indexing motions.

References

  1. Alaoui, S. F., Jacquemin, C., and Bevilacqua, F. (2013). Chiseling bodies: an augmented dance performance. In Proceedings of ACM SIGCHI Conference on Human Factors in Computing Systems, Paris, France. ACM.
  2. Arikan, O., Forsyth, D. A., and O'Brien, J. F. (2003). Motion synthesis from annotations. ACM Trans. of Graphics, 22(3):402-408.
  3. Barbič, J., Safonova, A., Pan, J.-Y., Faloutsos, C., Hodgins, J. K., and Pollard, N. S. (2004). Segmenting motion capture data into distinct behaviors. In Proceedings of Graphics Interface, GI '04, pages 185-194.
  4. Chan, J. C. P., Leung, H., Tang, J. K. T., and Komura, T. (2011). A virtual reality dance training system using motion capture technology. IEEE Trans. on Learning Technologies, 4(2):187-195.
  5. Chao, M.-W., Lin, C.-H., Assa, J., and Lee, T.-Y. (2012). Human motion retrieval from hand-drawn sketch. IEEE Trans. on Visualization and Computer Graphics, 18(5):729-740.
  6. Chi, D., Costa, M., Zhao, L., and Badler, N. (2000). The EMOTE model for effort and shape. In Proceedings of SIGGRAPH '00, pages 173-182, NY, USA. ACM.
  7. Cimen, G., Ilhan, H., Capin, T., and Gurcay, H. (2013). Classification of human motion based on affective state descriptors. Computer Animation and Virtual Worlds, 24(3-4):355-363.
  8. CMU (2003). Carnegie Mellon University: Motion Capture Database. http://mocap.cs.cmu.edu/.
  9. Deng, Z., Gu, Q., and Li, Q. (2009). Perceptually consistent example-based human motion retrieval. In Proceedings of I3D '09, pages 191-198, NY, USA. ACM.
  10. Fang, A. C. and Pollard, N. S. (2003). Efficient synthesis of physically valid human motion. ACM Trans. of Graphics, 22(3):417-426.
  11. Gleicher, M. (1998). Retargetting motion to new characters. In Proceedings of SIGGRAPH '98, pages 33-42, NY, USA. ACM.
  12. Hartmann, B., Mancini, M., and Pelachaud, C. (2006). Implementing expressive gesture synthesis for embodied conversational agents. In Proceedings of GW'05, pages 188-199. Springer-Verlag.
  13. Hecker, C., Raabe, B., Enslow, R. W., DeWeese, J., Maynard, J., and van Prooijen, K. (2008). Real-time motion retargeting to highly varied user-created morphologies. ACM Trans. of Graphics, 27(3):1-27.
  14. Ikemoto, L. and Forsyth, D. A. (2004). Enriching a motion collection by transplanting limbs. In Proceedings of SCA '04, pages 99-108, Switzerland.
  15. Kapadia, M., Chiang, I.-k., Thomas, T., Badler, N. I., and Kider, Jr., J. T. (2013). Efficient motion retrieval in large motion databases. In Proceedings of I3D '13, pages 19-28, NY, USA. ACM.
  16. Keogh, E., Palpanas, T., Zordan, V. B., Gunopulos, D., and Cardle, M. (2004). Indexing large human-motion databases. In Proceedings of VLDB, pages 780-791.
  17. Kovar, L. and Gleicher, M. (2004). Automated extraction and parameterization of motions in large data sets. ACM Trans. of Graphics, 23(3):559-568.
  18. Kovar, L., Gleicher, M., and Pighin, F. (2002). Motion graphs. ACM Trans. of Graphics, 21(3):473-482.
  19. Krüger, B., Tautges, J., Weber, A., and Zinke, A. (2010). Fast local and global similarity searches in large motion capture databases. In Proceedings of SCA '10, pages 1-10, Switzerland. Eurographics Association.
  20. Kwon, T., Cho, Y.-S., Park, S. I., and Shin, S. Y. (2008). Two-character motion analysis and synthesis. IEEE Trans. on Visualization and Computer Graphics, 14(3):707-720.
  21. Lamb, W. (1965). Posture & gesture: an introduction to the study of physical behaviour. G. Duckworth, London.
  22. Liu, G., Zhang, J., Wang, W., and McMillan, L. (2005). A system for analyzing and indexing human-motion databases. In SIGMOD '05, pages 924-926.
  23. Luo, P. and Neff, M. (2012). A perceptual study of the relationship between posture and gesture for virtual characters. In Motion in Games, pages 254-265.
  24. Maletic, V. (1987). Body, Space, Expression: The Development of Rudolf Laban's Movement and Dance Concepts. Approaches to Semiotics. De Gruyter Mouton.
  25. Min, J., Liu, H., and Chai, J. (2010). Synthesis and editing of personalized stylistic human motion. In Proceedings of I3D'10, pages 39-46, NY, USA. ACM.
  26. Moore, C.-L. and Yamamoto, K. (1988). Beyond Words: Movement Observation and Analysis, Vol. 2. Gordon and Breach Science Publishers.
  27. Müller, M., Röder, T., and Clausen, M. (2005). Efficient content-based retrieval of motion capture data. ACM Trans. of Graphics, 24(3):677-685.
  28. Nann Winter, D., Widell, C., Truitt, G., and George-Falvy, J. (1989). Empirical studies of posture-gesture mergers. Journal of Nonverbal Behavior, 13(4):207-223.
  29. Okajima, S., Wakayama, Y., and Okada, Y. (2012). Human motion retrieval system based on LMA features using interactive evolutionary computation method. In Innov. in Intelligent Machines, pages 117-130.
  30. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39:1161-1178.
  31. Shapiro, A., Cao, Y., and Faloutsos, P. (2006). Style components. In Proceedings of GI'06, pages 33-39, Canada.
  32. Torresani, L., Hackney, P., and Bregler, C. (2006). Learning motion style synthesis from perceptual observations. In Proceedings of NIPS'06, pages 1393-1400.
  33. Troje, N. F. (2002). Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. Journal of Vision, 2(5):371-387.
  34. UCY (2012). University of Cyprus: Dance Motion Capture Database. http://dancedb.cs.ucy.ac.cy/.
  35. UTA (2011). University of Texas at Arlington: Human Motion Database. http://smile.uta.edu/hmd/.
  36. Wakayama, Y., Okajima, S., Takano, S., and Okada, Y. (2010). IEC-based motion retrieval system using Laban Movement Analysis. In Proceedings of KES'10, pages 251-260. Springer-Verlag.
  37. Wu, S., Wang, Z., and Xia, S. (2009). Indexing and retrieval of human motion data by a hierarchical tree. In Proceedings of VRST, pages 207-214, NY, USA. ACM.
  38. Zhao, L. and Badler, N. I. (2005). Acquiring and validating motion qualities from live limb gestures. Graphical Models, 67(1):1-16.
  39. Zhao, L. and Safonova, A. (2009). Achieving good connectivity in motion graphs. Graphical Models, 71(4):139-152.
Paper Citation


in Harvard Style

Aristidou A. and Chrysanthou Y. (2014). Feature Extraction for Human Motion Indexing of Acted Dance Performances. In Proceedings of the 9th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2014) ISBN 978-989-758-002-4, pages 277-287. DOI: 10.5220/0004662502770287


in Bibtex Style

@conference{grapp14,
author={Andreas Aristidou and Yiorgos Chrysanthou},
title={Feature Extraction for Human Motion Indexing of Acted Dance Performances},
booktitle={Proceedings of the 9th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2014)},
year={2014},
pages={277-287},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004662502770287},
isbn={978-989-758-002-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2014)
TI - Feature Extraction for Human Motion Indexing of Acted Dance Performances
SN - 978-989-758-002-4
AU - Aristidou A.
AU - Chrysanthou Y.
PY - 2014
SP - 277
EP - 287
DO - 10.5220/0004662502770287