Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction
Markus Kächele, Stefanie Rukavina, Günther Palm, Friedhelm Schwenker, Martin Schels
2015
Abstract
A major building block for the construction of reliable statistical classifiers in the context of affective human-computer interaction is the collection of training samples that appropriately reflect the complex nature of the desired patterns. In this application this is a particularly non-trivial issue: while it is widely agreed that emotional patterns should be incorporated into future computer operation, it is far from clear how this should be realized. Open questions remain, such as which types of emotional patterns to consider, how helpful they are for computer interaction, and the more fundamental question of which emotions actually occur in this context. In this paper we start by reviewing existing corpora and the respective techniques for the generation of emotional content, and further motivate and establish approaches that enable us to gather, identify and categorize patterns of human-computer interaction.
Paper Citation
in Harvard Style
Kächele M., Rukavina S., Palm G., Schwenker F. and Schels M. (2015). Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction. In Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-076-5, pages 367-373. DOI: 10.5220/0005282703670373
in Bibtex Style
@conference{icpram15,
author={Markus Kächele and Stefanie Rukavina and Günther Palm and Friedhelm Schwenker and Martin Schels},
title={Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction},
booktitle={Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2015},
pages={367-373},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005282703670373},
isbn={978-989-758-076-5},
}
in EndNote Style
TY - CONF
JO - Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction
SN - 978-989-758-076-5
AU - Kächele M.
AU - Rukavina S.
AU - Palm G.
AU - Schwenker F.
AU - Schels M.
PY - 2015
SP - 367
EP - 373
DO - 10.5220/0005282703670373