Authors: Hamzi Hamza 1; Paul Richard 2; Aymeric Suteau 2 and Mehdi Saleh 3
Affiliations: 1 Université d’Angers and I-MAGINER, France; 2 Université d’Angers, France; 3 I-MAGINER, France
Keyword(s): Virtual reality, Human-computer interaction, Emotion recognition, Affective computing, Job interview.
Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Emotion and Personality; Social Agents and Avatars; Social Agents in Computer Graphics
Abstract:
We present a multi-modal affective virtual environment (VE) for job interview training. The proposed platform aims to support real-time emotion-based simulations between an embodied conversational agent (ECA) and a human. The first goal is to train candidates (students, job hunters, etc.) to better master their emotional states and behavioral skills. The users’ emotional and behavioral states will be assessed using different human-machine interfaces and biofeedback sensors. Collected data will be processed in real time by a behavioral engine. A preliminary experiment was carried out to analyze the correspondence between the users’ perceived emotional states and the collected data. Participants were instructed to look at a series of sixty IAPS pictures and rate each picture on the following dimensions: joy, anger, surprise, disgust, fear and sadness.