Authors: Zakia Hammal (1) and Corentin Massot (2)
Affiliations: (1) Université de Montréal, Canada; (2) McGill University, Canada
Keyword(s): Facial Expressions, Multiscale Spatial Filtering, Holistic Processing, Features Processing, Classification, TBM.
Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Feature Extraction; Features Extraction; Image and Video Analysis; Informatics in Control, Automation and Robotics; Signal Processing, Sensors, Systems Modeling and Control
Abstract: Holistic and feature-based processing have both been shown to be involved, in different ways, in the analysis of facial expressions by human observers. The current paper proposes a novel method that combines both approaches for the segmentation of “emotional segments” and the dynamic recognition of the corresponding facial expressions. The proposed model is a new advancement of a previously proposed feature-based model for static facial expression recognition (Hammal et al., 2007). First, a new spatial filtering method is introduced for the holistic processing of the face, towards the automatic segmentation of “emotional segments”. Second, the new filtering-based method is applied as a feature-based process for the automatic and precise segmentation of the transient facial features and the estimation of their orientation. Third, a dynamic and progressive fusion of the permanent and transient facial feature deformations is performed inside each “emotional segment” for a temporal recognition of the corresponding facial expression. Experimental results show the robustness of the holistic and feature-based analysis, notably for the analysis of multi-expression sequences. Moreover, compared to static facial expression classification, the obtained performance increases by 12% and compares favorably with that of human observers.
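The abstract does not specify the filter bank used for the holistic stage, so the following is only a minimal illustrative sketch of the general idea: apply multiscale spatial filters to each face frame and flag frames whose filter energy departs from a neutral baseline as belonging to an “emotional segment”. The Laplacian-of-Gaussian bank, the scale values, the baseline choice (first frame assumed neutral), and the threshold ratio are all assumptions, not the authors' method.

```python
# Illustrative sketch only: multiscale spatial filtering of face crops and a
# simple energy-based rule to flag "emotional segments" in a video.
# The filter type, scales, and threshold are hypothetical choices.
import numpy as np
from scipy.ndimage import gaussian_laplace

def multiscale_energy(face_gray, sigmas=(1.0, 2.0, 4.0)):
    """Mean absolute Laplacian-of-Gaussian response, summed over scales."""
    img = face_gray.astype(float)
    return sum(np.abs(gaussian_laplace(img, sigma=s)).mean() for s in sigmas)

def emotional_segment_frames(frames, threshold_ratio=1.2):
    """Return indices of frames whose multiscale energy exceeds the baseline.

    `frames` is a sequence of grayscale face crops; the first frame is
    assumed (hypothetically) to be neutral and serves as the baseline.
    """
    baseline = multiscale_energy(frames[0])
    return [i for i, f in enumerate(frames)
            if multiscale_energy(f) > threshold_ratio * baseline]
```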
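The TBM keyword refers to the Transferable Belief Model, in which evidence from each facial feature is expressed as a basic belief assignment (BBA) over subsets of expressions and fused by the conjunctive combination rule. The sketch below shows only that generic rule with toy masses; the actual features, expression set, and mass definitions used in the paper are not given in the abstract and are assumed here for illustration.

```python
# Generic TBM conjunctive combination of two basic belief assignments (BBAs),
# each a dict mapping frozenset-of-expressions -> mass. Toy values only.
from itertools import product

def conjunctive_combination(m1, m2):
    """Fuse two BBAs; mass on conflicting hypotheses goes to the empty set."""
    combined = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b  # empty set retains conflict (open-world TBM)
        combined[inter] = combined.get(inter, 0.0) + wa * wb
    return combined

# Hypothetical usage: evidence from eyebrows and mouth about three expressions.
E = frozenset({"joy", "surprise", "neutral"})
m_brows = {frozenset({"surprise"}): 0.6, E: 0.4}
m_mouth = {frozenset({"joy", "surprise"}): 0.7, E: 0.3}
print(conjunctive_combination(m_brows, m_mouth))
```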