• Improve the interaction with on-board interfaces and virtual assistants.
This empathic approach therefore aims to revolutionize transportation through the analysis of physiological (biometric) and contextual (environmental) data, fused to improve the understanding of the passenger's emotional state. This will allow the system to respond to the user appropriately, establishing trust between people and automated vehicles and enhancing a personalized in-cabin experience. Furthermore, the empathic module can be applied not only to cars but also to other means of transport, such as buses, trucks, planes, and ships, to understand the occupants' experience and provide tailored services accordingly.
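As a minimal sketch of this fusion step, the snippet below concatenates a few hypothetical biometric and environmental features into a single input vector for a downstream emotion estimator. The feature names, units, and normalization constants are illustrative assumptions, not the project's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical feature containers; names and units are illustrative only.
@dataclass
class PhysiologicalFeatures:
    heart_rate_bpm: float      # e.g. derived from the ECG signal
    emg_rms_mv: float          # muscle-activity level from the EMG signal

@dataclass
class ContextualFeatures:
    cabin_temp_c: float        # in-cabin temperature
    traffic_density: float     # 0 (empty road) .. 1 (congested)

def fuse_features(phys: PhysiologicalFeatures,
                  ctx: ContextualFeatures) -> list[float]:
    """Concatenate roughly normalized biometric and environmental
    features into one vector for a downstream emotion estimator."""
    return [
        (phys.heart_rate_bpm - 60.0) / 60.0,   # offset from a nominal resting rate
        phys.emg_rms_mv,
        (ctx.cabin_temp_c - 21.0) / 10.0,      # offset from a comfort setpoint
        ctx.traffic_density,
    ]

vec = fuse_features(PhysiologicalFeatures(78.0, 0.12),
                    ContextualFeatures(24.0, 0.7))
print(vec)  # → [0.3, 0.12, 0.3, 0.7]
```

In a real system the same pattern would extend to many more channels (respiration, skin conductance, weather, driving context), with the fused vector feeding the emotional model.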
5 CONCLUSIONS
The results of the current experiment validate that it is possible to detect changes in the state of the occupants on board from physiological signals. The extraction and analysis of key parameters of the ECG and EMG signals makes it possible to obtain Arousal and Valence values and thereby estimate the occupants' emotional state. These results have positive implications for the automobile industry, enabling CAVs to understand how we feel and to use that information to make the system more empathic, responding to occupant emotions in real time and thereby enhancing CAV acceptance.
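As an illustration of this final estimation step, the sketch below maps an (Arousal, Valence) pair onto a coarse emotional state using circumplex-style quadrants. The thresholds and labels are illustrative assumptions rather than the model actually used in the experiment.

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map (valence, arousal) in [-1, 1] to a coarse emotional state,
    following the usual circumplex-style quadrants. The labels are
    illustrative; a deployed system would use a finer-grained model."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"     # pleasant, high activation
    if valence < 0 and arousal >= 0:
        return "stressed/angry"    # unpleasant, high activation
    if valence < 0:
        return "sad/bored"         # unpleasant, low activation
    return "calm/relaxed"          # pleasant, low activation

print(emotion_quadrant(0.6, 0.4))   # → excited/happy
print(emotion_quadrant(-0.5, 0.7))  # → stressed/angry
```

A real-time empathic module would evaluate such a mapping continuously on the Arousal and Valence values estimated from the ECG and EMG parameters, and adapt the cabin response accordingly.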
Future tests with subjects in the immersive Human Autonomous Vehicle (HAV) will allow the SUaaVE project to generate a reliable emotional model that is more sensitive to differences in gender perspective, driving experience, and personal profile.
ACKNOWLEDGEMENTS
The paper presents the overall objective and the methodology of the SUaaVE project (SUpporting acceptance of automated VEhicle), funded by the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 814999.