Facial Signs and Psycho-physical Status Estimation for Well-being Assessment

F. Chiarugi, G. Iatraki, E. Christinaki, D. Manousos, G. Giannakakis, M. Pediaditis, A. Pampouchidou, K. Marias, M. Tsiknakis

2014

Abstract

Stress and anxiety act as psycho-physical factors that increase the risk of developing several chronic diseases. Since they appear as early indicators, it is very important to be able to evaluate them in a contactless and non-intrusive manner, in order to avoid inducing artificial stress or anxiety in the individual in question. For these reasons, this paper analyses methodologies for the extraction of the relevant facial signs from images or videos, their classification, and techniques for coding these signs into appropriate psycho-physical statuses. A review of existing datasets for the assessment of the various methodologies for facial expression analysis is reported. Finally, a short summary of the most interesting findings in the various stages of the procedure is provided, with the aim of achieving new contactless methods for the promotion of an individual's well-being.



Paper Citation


in Harvard Style

Chiarugi F., Iatraki G., Christinaki E., Manousos D., Giannakakis G., Pediaditis M., Pampouchidou A., Marias K. and Tsiknakis M. (2014). Facial Signs and Psycho-physical Status Estimation for Well-being Assessment. In Proceedings of the International Conference on Health Informatics - Volume 1: SUPERHEAL, (BIOSTEC 2014) ISBN 978-989-758-010-9, pages 555-562. DOI: 10.5220/0004934405550562


in Bibtex Style

@conference{superheal14,
author={F. Chiarugi and G. Iatraki and E. Christinaki and D. Manousos and G. Giannakakis and M. Pediaditis and A. Pampouchidou and K. Marias and M. Tsiknakis},
title={Facial Signs and Psycho-physical Status Estimation for Well-being Assessment},
booktitle={Proceedings of the International Conference on Health Informatics - Volume 1: SUPERHEAL, (BIOSTEC 2014)},
year={2014},
pages={555-562},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004934405550562},
isbn={978-989-758-010-9},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Health Informatics - Volume 1: SUPERHEAL, (BIOSTEC 2014)
TI - Facial Signs and Psycho-physical Status Estimation for Well-being Assessment
SN - 978-989-758-010-9
AU - Chiarugi F.
AU - Iatraki G.
AU - Christinaki E.
AU - Manousos D.
AU - Giannakakis G.
AU - Pediaditis M.
AU - Pampouchidou A.
AU - Marias K.
AU - Tsiknakis M.
PY - 2014
SP - 555
EP - 562
DO - 10.5220/0004934405550562