CROSSMODAL PERCEPTION OF MISMATCHED EMOTIONAL EXPRESSIONS BY EMBODIED AGENTS

Yu Suk Cho, Ji He Suk, Kwang Hee Han

Abstract

Embodied agents attract considerable interest today because of their vital role in human-human and human-computer interactions in virtual worlds. A number of researchers have found that people can recognize and distinguish between emotions expressed by an embodied agent, and many studies have found that people respond to simulated emotions much as they do to human emotions. This study investigates the interpretation of mismatched emotions expressed by an embodied agent (e.g. a happy face with a sad voice). The study employed a 4 (visual: happy, sad, warm, cold) × 4 (audio: happy, sad, warm, cold) within-subjects repeated-measures design. The results suggest that people perceive emotions based not on a single channel but on both channels. Additionally, facial expression (happy face vs. sad face) modulates the relative influence of the two channels: the audio channel has more influence on the interpretation of emotions when the facial expression is happy. Participants were also able to perceive emotions that neither the face nor the voice expressed, suggesting that embodied agents may convey varied and subtle emotions using only a few basic emotional expressions.



Paper Citation


in Harvard Style

Suk Cho Y., He Suk J. and Hee Han K. (2009). CROSSMODAL PERCEPTION OF MISMATCHED EMOTIONAL EXPRESSIONS BY EMBODIED AGENTS. In Proceedings of the 11th International Conference on Enterprise Information Systems - Volume 5: ICEIS, ISBN 978-989-8111-88-3, pages 178-181. DOI: 10.5220/0001992001780181


in Bibtex Style

@conference{iceis09,
author={Yu Suk Cho and Ji He Suk and Kwang Hee Han},
title={CROSSMODAL PERCEPTION OF MISMATCHED EMOTIONAL EXPRESSIONS BY EMBODIED AGENTS},
booktitle={Proceedings of the 11th International Conference on Enterprise Information Systems - Volume 5: ICEIS},
year={2009},
pages={178-181},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001992001780181},
isbn={978-989-8111-88-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th International Conference on Enterprise Information Systems - Volume 5: ICEIS
TI - CROSSMODAL PERCEPTION OF MISMATCHED EMOTIONAL EXPRESSIONS BY EMBODIED AGENTS
SN - 978-989-8111-88-3
AU - Suk Cho Y.
AU - He Suk J.
AU - Hee Han K.
PY - 2009
SP - 178
EP - 181
DO - 10.5220/0001992001780181