Acknowledgements. The work described in this paper is funded by the EU under research grants CALLAS (IST-34800), IRIS (Reference: 231824) and Metabo (Reference: 216270).
References
1. Bailenson, J.; Pontikakis, E.; Mauss, I.; Gross, J.; Jabon, M.; Hutcherson, C.; Nass, C.; John, O.: Real-time classification of evoked emotions using facial feature tracking and physiological responses. Int'l Journal of Human-Computer Studies, 66(5):303-317, 2008.
2. Caridakis, G.; Raouzaiou, A.; Karpouzis, K.; Kollias, S.: Synthesizing gesture expressivity based on real sequences. In Proc. of the LREC Workshop on Multimodal Corpora: From Multimodal Behaviour Theories to Usable Models, 2006.
3. Charles, F.; Pizzi, D.; Cavazza, M.; Vogt, T.; André, E.: Emotional input for character-based interactive storytelling. In Proc. of the 8th International Conference on Autonomous Agents and Multiagent Systems (AAMAS), Budapest, Hungary, 2009.
4. Gilroy, S. W.; Cavazza, M.; Chaignon, R.; Mäkelä, S.-M.; Niranen, M.; André, E.; Vogt, T.; Urbain, J.; Billinghurst, M.; Seichter, H.; Benayoun, M.: E-tree: emotionally driven augmented reality art. In Proc. ACM Multimedia, pages 945-948, Vancouver, BC, Canada, 2008. ACM.
5. Hönig, F.; Wagner, J.; Batliner, A.; Nöth, E.: Classification of user states with physiological signals: On-line generic features vs. specialized feature sets. In Proc. of the 17th European Signal Processing Conference (EUSIPCO-2009), 2009.
6. Jacucci, G. G.; Spagnolli, A.; Chalambalakis, A.; Morrison, A.; Liikkanen, L.; Roveda, S.; Bertoncini, M.: Bodily explorations in space: Social experience of a multimodal art installation. In Proc. of the Twelfth IFIP Conference on Human-Computer Interaction (INTERACT), 2009.
7. Kim, J.; André, E.: Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell., 30(12):2067-2083, 2008.
8. Kim, J.; Ragnoni, A.; Biancat, J.: In-vehicle monitoring of affective symptoms for diabetic drivers. In Proc. of the Int. Conf. on Health Informatics (HEALTHINF), 2010. [in press]
9. Vogt, T.; André, E.; Bee, N.: A framework for online recognition of emotions from voice. In Proc. of the Workshop on Perception and Interactive Technologies for Speech-Based Systems, Kloster Irsee, Germany, 2008.
10. Wagner, J.; Kim, J.; André, E.: From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification. In Proc. IEEE ICME 2005, pages 940-943, Amsterdam, 2005.
11. Wagner, J.; André, E.; Jung, F.: Smart sensor integration: A framework for multimodal emotion recognition in real-time. In Affective Computing and Intelligent Interaction (ACII 2009), 2009.
12. Zeng, Z.; Pantic, M.; Roisman, G. I.; Huang, T. S.: A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell., 31(1):39-58, 2009.