Emotion Recognition through Body Language using RGB-D Sensor
Lilita Kiforenko, Dirk Kraft
2016
Abstract
This paper presents results on automatic non-acted human emotion recognition using full standing body movements and postures. The focus of this paper is to show that it is possible to classify emotions using a consumer depth sensor in an everyday scenario. The features for classification are body joint rotation angles and meta-features that are fed into a Support Vector Machine classifier. The work of Gaber-Barron and Si (2012) is used as inspiration, and many of their proposed meta-features are reimplemented or modified. In this work we try to identify "basic" human emotions that are triggered by various visual stimuli. We present an emotion dataset recorded using the Microsoft Kinect for Windows sensor, with body joint rotation angles extracted using the Microsoft Kinect Software Development Kit 1.6. The classified emotions are curiosity, confusion, joy, boredom and disgust. We show that real human emotions can be classified using body movements and postures with a classification accuracy of 55.62%.
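A minimal sketch of the kind of pipeline the abstract describes: per-recording joint rotation angles from a Kinect SDK 1.6 skeleton (20 joints) are collapsed into simple meta-features and classified with a Support Vector Machine. The paper's actual meta-features, kernel and parameters are not given in the abstract, so the feature choices, synthetic data and scikit-learn usage below are illustrative assumptions only.

# Minimal illustrative sketch (not the authors' implementation): summarise joint
# rotation angles into simple meta-features and classify them with an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

EMOTIONS = ["curiosity", "confusion", "joy", "boredom", "disgust"]

def meta_features(angles):
    # angles: (frames x joints) array of joint rotation angles for one recording.
    # Mean, standard deviation, range and mean frame-to-frame change per joint
    # stand in for the paper's meta-features, which the abstract does not list.
    mean = angles.mean(axis=0)
    std = angles.std(axis=0)
    value_range = angles.max(axis=0) - angles.min(axis=0)
    motion = np.abs(np.diff(angles, axis=0)).mean(axis=0)
    return np.concatenate([mean, std, value_range, motion])

# Synthetic stand-in data: 100 recordings, 90 frames each, 20 joint angles
# (the Kinect SDK 1.6 skeleton tracks 20 joints).
rng = np.random.default_rng(0)
X = np.stack([meta_features(rng.normal(0.0, 30.0, size=(90, 20))) for _ in range(100)])
y = rng.integers(0, len(EMOTIONS), size=100)

# RBF-kernel SVM after feature standardisation; kernel and C are assumptions.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("mean 5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())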
References
- Azcarate, A., Hageloh, F., van de Sande, K., and Valenti, R. (2005). Automatic facial emotion recognition.
- Camras, L., Sullivan, J., and Michel, G. (1993). Do infants express discrete emotions? Adult judgements of facial, vocal, and body actions. Journal of Nonverbal Behavior, 17:171-186.
- Catuhe, D. (2013). Kinect Toolbox. [Online; accessed 04-April-2013].
- Darwin, C. (1872). Expression of the emotions in man and animals. John Murray.
- D'Mello, S. and Graesser, A. (2009). Automatic detection of learner's affect from gross body language. Applied Artificial Intelligence, 23:123-150.
- Ekman, P. (1992). An argument for basic emotions. Cognition & Emotion, 6(3-4):169-200.
- Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48:384-392.
- Ekman, P. and Friesen, W. (1974). Detecting deception from the body or face. Journal of Personality and Social Psychology, 29:288-298.
- Gaber-Barron, M. and Si, M. (2012). Using body movement and posture for emotion detection in non-acted scenarios. In 2012 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), pages 1-8.
- Glowinski, D., Dael, N., Camurri, A., Volpe, G., Mortillaro, M., and Scherer, K. (2011). Toward a minimal representation of affective gestures. IEEE Transactions on Affective Computing, 2(2):106-116.
- Gunes, H. and Piccardi, M. (2005). Affect recognition from face and body: Early fusion versus late fusion. In IEEE International Conference on Systems, Man and Cybernetics, pages 3437-3443.
- Kapoor, A., Picard, R., and Ivanov, Y. (2004). Probabilistic combination of multiple modalities to detect interest. International Conference on Pattern Recognition, 3:969-972.
- Kapur, A., Kapur, A., Virji-Babul, N., Tzanetakis, G., and Driessen, P. F. (2005). Gesture-based affective computing on motion capture data. In Affective Computing and Intelligent Interaction, pages 1-7. Springer.
- Keltner, D. and Haidt, J. (1999). Social functions of emotions at four levels of analysis. Cognition & Emotion, 13(5):505-521.
- Kleinsmith, A., Bianchi-Berthouze, N., and Steed, A. (2011). Automatic recognition of non-acted affective postures. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 41(4):1027-1038.
- Lee, D., Yun, W. H., Park, C. K., Yoon, H., Kim, J., and Park, C. H. (2015). Measuring the engagement level of children for multiple intelligence test using Kinect. In Proc. SPIE, volume 9445, pages 944529-1 to 944529-5.
- Oblak, D. (2013). NDtw. [Online; accessed 30-May-2013].
- Ortony, A. and Turner, T. (1990). What's basic about basic emotions? Psychological Review, 97(3):315-331.
- Ratanamahatana, C. and Keogh, E. (2005). Three myths about dynamic time warping data mining. In SIAM International Conference on Data Mining, Newport Beach, CA, April 21-23.
- Sanghvi, J., Castellano, G., Leite, I., Pereira, A., McOwan, P., and Paiva, A. (2011). Automatic analysis of affective postures and body motion to detect engagement with a game companion. In Billard, A., Kahn Jr., P. H., Adams, J. A., and Trafton, J. G., editors, Proceedings of the 6th International Conference on Human Robot Interaction, HRI 2011, Lausanne, Switzerland, March 6-9, 2011, pages 305-312. ACM.
- Schaaff, K. and Schultz, T. (2009). Towards emotion recognition from electroencephalographic signals. In 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009), 10-12 Sept. 2009.
- Scherer, K. and Wallbott, H. (1990). Mimik im Kontext: Die Bedeutung verschiedener Informationskomponenten für das Erkennen von Emotionen [Facial expression in context: The significance of different information components for recognizing emotions]. Hogrefe.
- Singh, G., Jati, A., Khasnobish, A., Bhattacharyya, S., Konar, A., Tibarewala, D., and Janarthanan, R. (2012). Negative emotion recognition from stimulated EEG signals. In Third International Conference on Computing Communication & Networking Technologies (ICCCNT), 26-28 July 2012.
- Wallbott, H. (1998). Bodily expression of emotion. European Journal of Social Psychology, 28:879-896.
- Witten, I. H., Frank, E., and Hall, M. A. (2011). Data Mining: Practical Machine Learning Tools and Techniques. Elsevier, 3rd edition.
Paper Citation
in Harvard Style
Kiforenko L. and Kraft D. (2016). Emotion Recognition through Body Language using RGB-D Sensor. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016) ISBN 978-989-758-175-5, pages 398-405. DOI: 10.5220/0005783403980405
in Bibtex Style
@conference{visapp16,
author={Lilita Kiforenko and Dirk Kraft},
title={Emotion Recognition through Body Language using RGB-D Sensor},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={398-405},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005783403980405},
isbn={978-989-758-175-5},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)
TI - Emotion Recognition through Body Language using RGB-D Sensor
SN - 978-989-758-175-5
AU - Kiforenko L.
AU - Kraft D.
PY - 2016
SP - 398
EP - 405
DO - 10.5220/0005783403980405