produced at each step. Using this method, our
biosignal classification system achieved an average
recognition rate of 90% across the four emotional states.
Table 1 shows the confusion matrix for the
classification.
Table 1: LDA classifier confusion matrix (rows: input
emotion; columns: classifier output; rightmost column:
per-class recognition rate).

Input\Output   JO     AN     SA     PL     %
JO             0.96   0      0      0.04   96
AN             0      1.00   0      0      100
SA             0.04   0      0.92   0.04   92
PL             0.12   0      0.16   0.72   72
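The per-class rates in the rightmost column of Table 1 are simply the diagonal of the confusion matrix, and the 90% figure is their unweighted mean. A minimal sketch, using only the values reproduced from Table 1:

```python
import numpy as np

# Rows: true (input) emotion; columns: predicted (output) emotion.
# Values reproduced from Table 1; each row sums to 1.
labels = ["JO", "AN", "SA", "PL"]
confusion = np.array([
    [0.96, 0.00, 0.00, 0.04],
    [0.00, 1.00, 0.00, 0.00],
    [0.04, 0.00, 0.92, 0.04],
    [0.12, 0.00, 0.16, 0.72],
])

per_class = np.diag(confusion)   # recognition rate for each emotion
average = per_class.mean()       # unweighted average over the 4 classes
print(dict(zip(labels, per_class)), average)  # average recognition rate ~ 0.90
```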
5 CONCLUSIONS
A novel emotion elicitation scheme based on self-
generated emotions is presented, engendering a high
degree of confidence in the emotional relevance of the
collected biosignals. Discrete emotional state
recognition from physiological signals, using pattern
recognition and signal processing techniques, is shown
to be highly accurate: a correct average recognition
rate of 90% is achieved using sequential forward
selection and Fisher dimensionality reduction, coupled
with a Linear Discriminant Analysis classifier.
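The pipeline described above can be sketched with off-the-shelf stand-ins: scikit-learn's SequentialFeatureSelector for sequential forward selection, and LinearDiscriminantAnalysis, whose projection implements Fisher dimensionality reduction and whose decision rule is the LDA classifier. The data here are synthetic placeholders, not the paper's biosignal features:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic placeholder data: 80 trials, 12 candidate features, 4 states.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 12))
y = rng.integers(0, 4, size=80)   # 4 emotional states, coded 0..3
X[:, 0] += y                      # make one feature class-informative

pipeline = Pipeline([
    # Greedy forward selection, scored by cross-validated LDA accuracy.
    ("sfs", SequentialFeatureSelector(
        LinearDiscriminantAnalysis(), n_features_to_select=4,
        direction="forward", cv=4)),
    # Fisher projection + linear discriminant classification.
    ("lda", LinearDiscriminantAnalysis()),
])
pipeline.fit(X, y)
print("training accuracy:", pipeline.score(X, y))
```

This is a sketch of the general technique only; the paper's actual feature set, selection criterion, and validation protocol are not reproduced here.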
We believe that the high classification rate is due
in part to our use of a professional method actor as
the test subject. We speculate that non-actor subjects
would yield lower rates because of the high
variability of emotional expressivity across a large
population. Testing how well this type of
machine-based emotion recognition generalizes remains
an avenue for future research.
Our ongoing research also aims to support
real-time classification of discrete emotional states.
Specifically, continuous arousal/valence mappings
from biosignals will drive our emotional-imaging
generator for multimedia content synthesis and
control in a theatrical performance context. In
addition, we are exploring the therapeutic and
performance training possibilities of our system.
Because what we are building is fundamentally an
enriched biofeedback device, we anticipate
applications ranging from stress reduction for the
general population to the generation of concrete
emotional expression for those with autism or other
communication disorders.
ACKNOWLEDGEMENTS
The authors wish to thank the Natural Sciences and
Engineering Research Council of Canada (NSERC)
New Media Initiative and the Centre for
Interdisciplinary Research in Music Media and
Technology at McGill University for their funding
support for this research. Special thanks are also
due to Laurence Dauphinais, who gave many hours
of her time and her artistic insight, and to Thought
Technology Ltd., which provided the acquisition
hardware and software used in this research.
BIOSIGNALS 2008 - International Conference on Bio-inspired Systems and Signal Processing