Authors: Masahiro Yasuda, Zhang Dou and Minoru Nakayama
Affiliation: Tokyo Institute of Technology, Japan
Keyword(s): Human Emotion, Facial Expression, EEG, ERP, Chronological Analysis, Prediction.
Related Ontology Subjects/Areas/Topics: Applications and Services; Biomedical Engineering; Biomedical Signal Processing; Computer Vision, Visualization and Computer Graphics; Informatics in Control, Automation and Robotics; Medical Image Detection, Acquisition, Analysis and Processing; Signal Processing, Sensors, Systems Modeling and Control; Time and Frequency Response; Time-Frequency Analysis
Abstract:
To assess human emotion using electroencephalograms (EEGs), the relationship between emotional impressions of images of facial expressions and features of event-related potentials (ERPs) recorded with three electrodes was analyzed. First, two clusters of emotional impressions were extracted from two-dimensional responses on the Affect Grid scale. Second, features of the ERPs evoked in response to the two clusters were examined: time slots in which amplitude differences appeared in the ERPs were measured, and differences in ERP frequency power were extracted for each electrode. To evaluate these features, prediction performance for the two clusters was assessed using discriminant analysis of the features. In addition, the dependence of prediction performance on the choice of band-pass filter was examined.
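The analysis pipeline sketched in the abstract (band-pass filtering of single-trial ERPs, extraction of a band-power feature, and two-class discriminant analysis) can be illustrated with a minimal example. This is a hypothetical sketch on synthetic data, not the authors' implementation: the sampling rate, the 4–8 Hz band, the log band-power feature, and the use of linear discriminant analysis are all assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs = 250  # sampling rate in Hz (assumed)

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter of a 1-D signal."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, x)

# Synthetic single-trial ERPs: 40 trials of 1 s, two impression clusters,
# with cluster 1 carrying an extra 5 Hz (theta-band) component.
n_trials, n_samples = 40, fs
t = np.arange(n_samples) / fs
labels = np.repeat([0, 1], n_trials // 2)
erps = rng.normal(0.0, 1.0, (n_trials, n_samples))
erps[labels == 1] += 2.0 * np.sin(2 * np.pi * 5 * t)

# Feature: log mean power in the 4-8 Hz band after filtering each trial.
filtered = np.array([bandpass(x, 4.0, 8.0, fs) for x in erps])
features = np.log(np.mean(filtered ** 2, axis=1, keepdims=True))

# Two-class discriminant analysis of the band-power feature.
lda = LinearDiscriminantAnalysis().fit(features, labels)
accuracy = lda.score(features, labels)
print(f"training accuracy: {accuracy:.2f}")
```

Repeating the same fit while varying the filter's pass band would give a simple way to probe how prediction performance depends on the chosen band, along the lines of the filter-dependency comparison the abstract describes.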