Authors: Claudio Loconsole ¹; Catarina Runa Miranda ²; Gustavo Augusto ²; Antonio Frisoli ¹ and Verónica Costa Orvalho ²
Affiliations: ¹ PERCRO Laboratory and Scuola Superiore Sant'Anna, Italy; ² Universidade do Porto, Portugal
Keyword(s): Human-Computer Interaction, Emotion Recognition, Computer Vision.
Related Ontology Subjects/Areas/Topics: Applications and Services; Camera Networks and Vision; Computer Vision, Visualization and Computer Graphics; Enterprise Information Systems; Entertainment Imaging Applications; Features Extraction; Human and Computer Interaction; Human-Computer Interaction; Image and Video Analysis; Image Formation and Preprocessing; Image Formation, Acquisition Devices and Sensors; Medical Image Applications; Motion, Tracking and Stereo Vision; Optical Flow and Motion Analyses; Tracking and Visual Navigation
Abstract:
Facial emotions provide an essential source of information commonly used in human communication. For humans, their recognition is automatic and exploits the real-time variations of facial features. However, replicating this natural process with computer vision systems remains a challenge, since automation and real-time requirements are often compromised in order to achieve accurate emotion detection. In this work, we propose and validate a novel facial feature extraction methodology for automatic facial emotion recognition that achieves a high degree of detection accuracy. The methodology uses the output of a real-time face tracker to define and extract two new types of features: eccentricity features and linear features. These features are then used to train a machine learning classifier. The result is a processing pipeline that classifies the six basic Ekman emotions (plus Contemptuous and Neutral) in real time, requiring neither manual intervention nor prior information about facial traits.
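As an illustration of the pipeline described in the abstract, the sketch below shows one way eccentricity features (from ellipses fitted to groups of tracked landmarks) and linear features (scale-normalized inter-landmark distances) could be computed and fed to a classifier. The covariance-based ellipse fit, the landmark groupings, and the scikit-learn SVM are assumptions made for illustration only, not the authors' exact implementation.

# Illustrative sketch only: the ellipse fit, landmark groupings, and the
# scikit-learn SVM are assumptions, not the authors' exact pipeline.
import numpy as np
from sklearn.svm import SVC

def eccentricity(points):
    """Eccentricity of a best-fit ellipse for a set of 2-D landmarks.

    Uses the covariance eigenvalues as squared semi-axes (a >= b) and
    returns sqrt(1 - b^2 / a^2): 0 for a circle, approaching 1 for a line.
    """
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts.T)
    a2, b2 = np.linalg.eigvalsh(cov)[::-1]  # descending: a^2, b^2
    return float(np.sqrt(max(0.0, 1.0 - b2 / a2))) if a2 > 0 else 0.0

def linear_feature(p, q, ref_dist):
    """Distance between two landmarks, normalized by a reference distance
    (e.g. inter-ocular distance) to remove scale dependence."""
    return float(np.linalg.norm(np.asarray(p) - np.asarray(q)) / ref_dist)

def extract_features(landmarks, regions, pairs, ref_pair):
    """Build one feature vector per tracked frame.

    landmarks: (N, 2) array from the face tracker (indices are hypothetical).
    regions:   list of landmark-index lists (e.g. mouth, eye) -> eccentricity features.
    pairs:     list of (i, j) landmark index pairs -> linear features.
    ref_pair:  (i, j) pair whose distance is used for scale normalization.
    """
    ref = np.linalg.norm(landmarks[ref_pair[0]] - landmarks[ref_pair[1]])
    ecc = [eccentricity(landmarks[idx]) for idx in regions]
    lin = [linear_feature(landmarks[i], landmarks[j], ref) for i, j in pairs]
    return np.array(ecc + lin)

def train_classifier(X, y):
    """Train a classifier on per-frame feature vectors.

    X: (n_frames, n_features) feature matrix; y: emotion labels, one of the
    eight classes (six Ekman emotions plus Contemptuous and Neutral).
    """
    clf = SVC(kernel="rbf", gamma="scale")
    return clf.fit(X, y)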