Authors:
Carolain Anto-Chavez; Richard Maguiña-Bernuy and Willy Ugarte
Affiliation:
Universidad Peruana de Ciencias Aplicadas, Lima, Peru
Keyword(s):
Facial, Emotion, Expression, Recognition, Machine Learning, Real-Time, Mobile, FER.
Abstract:
Human-computer interaction grows noticeably every year, driving the evolution of computer vision to make that interaction more efficient and effective. This paper presents a CNN-based facial emotion recognition model that can run on mobile devices in real time with high accuracy. Models implemented in other research are usually large; although they achieve high accuracy, they fail to produce predictions quickly enough for fluid interaction with the computer. To address this, we implemented a lightweight CNN model trained on the FER-2013 dataset to predict seven basic emotions. Experimentation shows that our model achieves 66.52% validation accuracy, fits in a 13.23 MB file, and attains average processing times of 14.39 ms and 16.06 ms on a tablet and a phone, respectively.
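The sketch below illustrates, under assumptions, what a lightweight CNN for FER-2013-style input could look like: 48x48 grayscale images and seven emotion classes, followed by conversion to a compact on-device format. The layer sizes, the use of Keras, and the TensorFlow Lite export are illustrative assumptions, not the architecture or runtime reported in the paper.

```python
# Hypothetical lightweight CNN for FER-2013-style input (48x48 grayscale,
# 7 emotion classes). Layer widths are illustrative, not the published model.
import tensorflow as tf
from tensorflow.keras import layers, models


def build_lightweight_fer_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),   # keeps the parameter count small
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_lightweight_fer_cnn()
    model.summary()

    # Assumption: exporting to TensorFlow Lite for on-device inference;
    # the paper does not state which mobile runtime was used.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()
    with open("fer_model.tflite", "wb") as f:
        f.write(tflite_model)
```

A small footprint like this is what keeps the exported file size and per-frame inference time low enough for real-time use on tablets and phones.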