TOWARDS AUTOMATED INFERENCING OF EMOTIONAL STATE FROM FACE IMAGES

Ioanna-Ourania Stathopoulou, George A. Tsihrintzis

Abstract

Automated facial expression classification is very important in the design of new human-computer interaction modes and multimedia interactive services, and arises as a difficult, yet crucial, pattern recognition problem. Recently, we have been building such a system, called NEU-FACES, which processes multiple camera images of computer user faces with the ultimate goal of determining their affective state. Here, we present results from an empirical study we conducted on how humans classify facial expressions, the corresponding error rates, and the degree to which a face image alone allows a human observer to recognize emotion. This study establishes related system design requirements, quantifies the statistical expression recognition performance of humans, and identifies quantitative facial features with high expression discrimination and classification power.



Paper Citation


in Harvard Style

Stathopoulou I.-O. and Tsihrintzis G.A. (2007). TOWARDS AUTOMATED INFERENCING OF EMOTIONAL STATE FROM FACE IMAGES. In Proceedings of the Second International Conference on Software and Data Technologies - Volume 1: ICSOFT, ISBN 978-989-8111-05-0, pages 206-211. DOI: 10.5220/0001329802060211


in Bibtex Style

@conference{icsoft07,
author={Ioanna-Ourania Stathopoulou and George A. Tsihrintzis},
title={TOWARDS AUTOMATED INFERENCING OF EMOTIONAL STATE FROM FACE IMAGES},
booktitle={Proceedings of the Second International Conference on Software and Data Technologies - Volume 1: ICSOFT},
year={2007},
pages={206-211},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001329802060211},
isbn={978-989-8111-05-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Second International Conference on Software and Data Technologies - Volume 1: ICSOFT
TI - TOWARDS AUTOMATED INFERENCING OF EMOTIONAL STATE FROM FACE IMAGES
SN - 978-989-8111-05-0
AU - Stathopoulou I.-O.
AU - Tsihrintzis G.A.
PY - 2007
SP - 206
EP - 211
DO - 10.5220/0001329802060211