FACIAL EXPRESSION RECOGNITION BASED ON FACIAL MUSCLES BEHAVIOR ESTIMATION

Saki Morita, Kuniaki Uehara

Abstract

Recent developments in multimedia have created a need for an engineering study of the human face in communication media and man-machine interfaces. In this paper, we introduce a method not only for recognizing facial expressions and human emotion, but also for extracting rules from them. Facial data are obtained by tracking the relative position of each feature point in a time series. Our approach estimates the behavior of the muscles of facial expression from these data and evaluates it to recognize facial expressions. In the recognition process, the essential parameters that cause visible change in the face are extracted by estimating the force vectors of points on the face. The force vectors are calculated from the displacements of points on the face using FEM (the Finite Element Method). To compare the multi-streams of force vectors of each facial expression effectively, a new similarity metric, AMSS (Angular Metrics for Shape Similarity), is proposed. Finally, facial expression recognition experiments show that our approach achieves usable results even with few subjects, and that variable rules corresponding to AUs (Action Units) can be detected.

References

  1. Cook, R. D. (1995). Finite Element Modeling for Stress Analysis. Wiley.
  2. Ekman, P. and Friesen, W. (1978). The Facial Action Coding System. Consulting Psychologists Press.
  3. Essa, I. A. and Pentland, A. P. (1997). Coding, Analysis, Interpretation, and Recognition of Facial Expressions. IEEE Trans. Pattern Anal. Mach. Intell., 19(7):757-763.
  4. Gunopoulos, D. (2002). Discovering Similar Multidimensional Trajectories. In Proc. of the 18th International Conference on Data Engineering, pages 673-684.
  5. Kohavi, R. and John, G. H. (1997). Wrappers for Feature Subset Selection. Artificial Intelligence, 97(1-2):273-324.
  6. Tian, Y.-L., Kanade, T., and Cohn, J. F. (2001). Recognizing Action Units for Facial Expression Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(2):97-115.
  7. Lien, J. J., Kanade, T., Cohn, J. F., and Li, C. C. (1998). Automated Facial Expression Recognition Based on FACS Action Units. In Proc. of the 3rd. International Conference on Face & Gesture Recognition, pages 390-395.
  8. Sankoff, D. and Kruskal, J. B. (1983). Time Warps, String Edits, and Macromolecules: The Theory and Practice of Sequence Comparison. Addison-Wesley.
  9. Vlachos, M., Gunopulos, D., and Kollios, G. (2002). Robust Similarity Measures for Mobile Object Trajectories. In Proc. of the 13th International Workshop on Database and Expert Systems Applications, pages 721-728.
  10. Yacoob, Y. and Davis, L. (1994). Computing Spatio-Temporal Representations of Human Faces. In Proc. of Computer Vision and Pattern Recognition 94, pages 70-75.


Paper Citation


in Harvard Style

Morita S. and Uehara K. (2006). FACIAL EXPRESSION RECOGNITION BASED ON FACIAL MUSCLES BEHAVIOR ESTIMATION. In Proceedings of the First International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, ISBN 972-8865-40-6, pages 48-55. DOI: 10.5220/0001372400480055


in Bibtex Style

@conference{visapp06,
author={Saki Morita and Kuniaki Uehara},
title={FACIAL EXPRESSION RECOGNITION BASED ON FACIAL MUSCLES BEHAVIOR ESTIMATION},
booktitle={Proceedings of the First International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP},
year={2006},
pages={48-55},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001372400480055},
isbn={972-8865-40-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the First International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP
TI - FACIAL EXPRESSION RECOGNITION BASED ON FACIAL MUSCLES BEHAVIOR ESTIMATION
SN - 972-8865-40-6
AU - Morita S.
AU - Uehara K.
PY - 2006
SP - 48
EP - 55
DO - 10.5220/0001372400480055