human-machine interfaces for face-to-face communication and interaction.