Contagion of Physiological Correlates
of Emotion between Performer and Audience:
An Exploratory Study
Javier Jaimovich, Niall Coghlan and R. Benjamin Knapp
Sonic Arts Research Centre, Queen’s University Belfast
University Road, BT7 1NN, U.K.
Abstract. Musical and performance experiences are often described as evoking
powerful emotions, both in the listener/observer and player/performer. There is
a significant body of literature describing these experiences along with related
work examining physiological changes in the body during music listening and
the physiological correlates of emotional state. However, there are still open questions as to how and why emotional responses may be triggered by a performance, how audiences may be influenced by a performer's mental or emotional state, and what effect the presence of an audience has on performers. We
present a pilot study and some initial findings of our investigations into these
questions, utilising a custom software and hardware system we have developed.
Although this research is still at a pilot stage, our initial experiments point towards significant correlation between the physiological states of performers and audiences, and here we present the system, the experiments and our preliminary data.
1 Introduction
As computers and mobile devices become simultaneously smaller and more powerful, they are also being incorporated into everyday objects and our day-to-day lives. This move towards 'ubiquitous' computing means that these embedded devices are used in very diverse situations and environments that may require some degree of context awareness from the device. One branch of research into context-aware interactions is
what is known as ‘affective’ computing, using emotion or ‘affect’ as an information
and interaction channel for an electronic device or system [1]. This may allow appro-
priate responses from a computer system based on factors such as happiness, sadness,
frustration or stress. Machine recognition of emotional state is not a trivial matter and
research continues in a number of directions, such as facial emotion recognition,
vocal analysis and posture analysis [2]. Our research has chiefly focused on physio-
logical indicators of emotion (biosignals) and changes in emotional state such as
patterns in heart rate variability and galvanic skin response [3]. While it is difficult to assign a given emotion to a particular physiological state (as opposed to, say, facial indicators of emotion), biosignals do have the advantage of being largely outside conscious control and thus may be viewed as a more 'direct' connection with
the subject. It is also relatively easy to detect subtle continuous changes in physio-
logical (and by extension emotional) state, allowing for more nuanced interactions.
2 Music and Emotion
Emotions are a powerful force in driving human decision-making and action, frequently overpowering intellect or logical reasoning [4], and have the capacity to affect our interpretation and perception of events or content [5]. Music has been shown to have the capability to induce emotions in the listener [6], with corresponding physical [7] and physiological effects [8].
While most previous research into the emotional power of music has focused on structural and cognitive aspects, the neuropsychological underpinnings are only now being properly explored. Alternative mechanisms posited by Juslin and Västfjäll [9], such as brain stem reflexes, visual imagery, episodic memory and musical expectancy, are now thought to also play a role in the evocation of emotion.
Also among these alternatives is the possibility of emotional contagion, in which emotion is engendered in the listener corresponding to the perceived emotional content or intent of the music, such as a dissonant piece with harsh timbres and fast tempo suggesting anger. There is some evidence to support this induction of mood through the perceived affect of musical stimuli [10], and listeners often report a sensation of 'chills' from particular pieces of music with strong personal or emotional significance [11], which may also be an indicator of emotional peak experiences.
However, most of these experiments have taken place in laboratory settings, using pre-recorded musical examples on a one-to-one basis, with little consistency in subjects' responses to given pieces of music. So far little work has been done (on a physiological level) in examining group experiences of musical performance in a concert setting, and it is our belief that data gathered simultaneously from multiple participants in this ecological setting may shed some light on what causes and modulates our emotional responses to music.
We also hypothesise that there is a degree of emotional contagion between the performer and the audience, with a performer's affective state influencing the affective state of the audience. This may be observed through channels such as the previously mentioned facial or posture indicators, through affective modulation of performance style and technique, or through as yet unknown channels of affective communication.
It is important to stress that, using low-level biosignals such as GSR or HR, we are unable to definitively infer a given affective state in the monitored subject, such as happiness or boredom. We are, however, able to detect gross changes in state and to suggest the probability of a given state (with accuracy dependent on variables such as the number of signals monitored and the context). There may also be variables external to the monitoring environment (for example, events that occurred during the subject's day prior to monitoring, or feelings of illness) that will affect the biosignal readings.
3 Methodology
We carried out two experiments in different live performance environments, with separate subject groups. In each session, nine audience members were randomly selected to participate in the experiments; they were invited to sit in chairs augmented with sensors to detect physiological signals [12]. The musical program included three contemporary pieces during which the performers were measured with bio-sensors: a piano improvisation (12 min), an interactive electronic piece, 'Stem Cells' (12 min), and an electroacoustic piece diffused by the composer, 'Imago' (25 min). The three performers' biosignals were recorded simultaneously with those of the audience.
Fig. 1. ECG (HRV) and GSR sensors used for audience biosignal recording.
During the experiments we monitored two physiological signals known to be correlates of emotional state: Galvanic Skin Response (GSR) and Heart Rate Variability (HRV). GSR, also known as electrodermal response, is a method for measuring the conductance of the skin using an ohmmeter. Electrodes are situated on the palms or fingertips of the hands, where the eccrine glands, regulated by the sympathetic nervous system (SNS), produce sweat that varies the conductivity measured by the ohmmeter. Although one of the main evolutionary functions of the SNS is to regulate body temperature, researchers have, since early studies, correlated changes in GSR with different stimuli associated with emotional responses, such as film [13] and music [7].
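As an illustration of the measurement principle, the sketch below converts a raw analogue-to-digital reading from a hypothetical voltage-divider GSR circuit (skin in series with a fixed resistor, a common arrangement in simple sensor builds) into skin conductance in microsiemens. The circuit values are assumptions for the example, not the specification of the sensors used in this study.

```python
def adc_to_conductance_us(adc_value, v_supply=5.0, r_fixed=1.0e6, adc_max=1023):
    """Convert a raw ADC reading into skin conductance (microsiemens).

    Assumes a voltage divider in which the ADC measures the voltage
    across the skin; all component values here are illustrative.
    """
    v_skin = v_supply * adc_value / adc_max   # voltage across the skin electrodes
    v_fixed = max(v_supply - v_skin, 1e-9)    # voltage across the fixed resistor
    current = v_fixed / r_fixed               # current through the divider (amperes)
    r_skin = v_skin / current                 # skin resistance (ohms)
    return 1.0e6 / r_skin if r_skin > 0 else 0.0  # conductance in microsiemens
```

Higher sweat production lowers skin resistance, so increased SNS arousal appears as a rise in the returned conductance value.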
HRV is a feature extracted from an electrocardiogram (ECG) signal, which measures the electrical impulses produced by the heart during each beat. Heart Rate Variability refers specifically to the changes in the beat-to-beat interval of the heart. In other words, a heart rate of 70 beats per minute (bpm) is an average over time of fluctuations between successive heartbeats, and these may vary significantly from the 70 bpm. Several studies have observed patterns in HRV that are associated with emotional states, yet there is much controversy in the scientific literature regarding correlation with specific emotions. Nevertheless, there is agreement that HRV patterns change when compared to a neutral state [10]. In a previous study we found interesting differences in HRV patterns between different emotional states for musicians performing the same musical piece [14].
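To make the distinction between average heart rate and heart rate variability concrete, the minimal sketch below (with hypothetical RR values) derives the instantaneous beat-to-beat heart rate from a series of RR intervals, plus one common time-domain HRV feature, the standard deviation of RR intervals (SDNN). The study's own feature set is not specified here, so this is purely illustrative.

```python
import numpy as np

# Hypothetical beat-to-beat (RR) intervals in seconds, e.g. from ECG R-peaks.
rr = np.array([0.82, 0.90, 0.86, 0.79, 0.88, 0.84, 0.91, 0.85])

inst_hr = 60.0 / rr              # instantaneous heart rate for each beat (bpm)
mean_hr = 60.0 / rr.mean()       # the single "average bpm" figure (~70 bpm here)

# SDNN, a common time-domain HRV feature: standard deviation of RR intervals.
sdnn_ms = rr.std(ddof=1) * 1000.0

print(f"instantaneous HR per beat: {np.round(inst_hr, 1)} bpm")
print(f"mean HR: {mean_hr:.1f} bpm, SDNN: {sdnn_ms:.1f} ms")
```

In this example the average works out to roughly 70 bpm while the individual beats range from about 66 to 76 bpm, which is exactly the beat-to-beat fluctuation that HRV captures.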
BioControl¹ signal acquisition devices were used to capture the physiological signals, which were streamed wirelessly at 250 Hz to signal-recording computers via Wi-microDig² and Arduino³ microcontrollers. In order to ensure synchronicity between the physiological, visual and audio signals, every sample of data was time-stamped with a time code index which operated independently, as part of a recording system protocol developed by the authors. The recorded physiological signals were processed offline using MATLAB: the GSR signals were low-pass filtered (29th-order FIR filter with a 3 Hz cut-off frequency) and HRV was extracted using an algorithm created by the authors that measured the RR interval between beats (from the QRS waveform). In order to carry out a real-time evaluation of the performance, the signals were analyzed using a Max/MSP⁴ patch that allowed the visualization of all physiological signals, audio and video material simultaneously (see Fig. 2 and [14]).
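A sketch of this offline processing stage in Python (assuming NumPy/SciPy in place of the authors' MATLAB code) is given below; the FIR filter matches the order and cut-off stated above, while the R-peak detector is a generic amplitude-threshold stand-in, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import firwin, filtfilt, find_peaks

FS = 250  # sampling rate in Hz, as used for the streamed biosignals

def smooth_gsr(gsr, fs=FS):
    """Low-pass filter the GSR signal with a 29th-order FIR filter
    (30 taps) and a 3 Hz cut-off, as described in the text."""
    taps = firwin(numtaps=30, cutoff=3.0, fs=fs)
    return filtfilt(taps, [1.0], gsr)  # zero-phase filtering for offline use

def rr_intervals(ecg, fs=FS):
    """Extract RR intervals (seconds) from an ECG trace by locating
    R-peaks in the QRS complex. A crude threshold detector is used
    here as a stand-in for the authors' own algorithm."""
    peaks, _ = find_peaks(ecg,
                          height=0.6 * np.max(ecg),  # assume prominent R-peaks
                          distance=int(0.3 * fs))    # refractory gap (~200 bpm max)
    return np.diff(peaks) / fs
```

Note that filtfilt applies the filter forwards and backwards, avoiding phase delay; this is convenient for offline analysis but differs from a causal single-pass filter.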
Fig. 2. Visualization tool created in Max/MSP to provide continuous analysis of the physio-
logical and audiovisual data. The figure shows 10 channels of ECG and GSR data (9 audience
members and 1 performer) plus the audiovisual recordings.
Due to the inherent problems of recording data in a live performance, the amount of viable audience biosignals captured varied between performances (see Table 1). This was principally caused by loss of signals due to movement artefacts and the non-homogeneity of the subjects' physiology (in a controlled lab set-up, the equipment could be calibrated and adjusted to read the biosignals of subjects with different ranges).
¹ www.biocontrol.com [accessed 05 November, 2009]
² www.infusionsystems.com [accessed 05 November, 2009]
³ www.arduino.cc [accessed 05 November, 2009]
⁴ http://www.cycling74.com/products/max5 [accessed 05 November, 2009]
For the preliminary results presented in the next section, a selection of the most significant physiological reactions and correlations was chosen according to the observations made with the Max/MSP patch (see Fig. 2).
Table 1. Number of viable audience biosignals (GSR and HRV) recorded during the experiments.
Performance Venue GSR HRV
Piano Improvisation Sonic Lab, Belfast 5 5
Stem Cells Sonic Lab, Belfast 5 7
Stem Cells School of Music, Durham 2 5
Imago School of Music, Durham 2 4
4 Preliminary Results
We present here a preliminary qualitative analysis of the recorded data, which at this early stage cannot be considered conclusive due to the limited sample size of the study.
Fig. 3. Relative GSR levels of two selected audience members (bottom) and the performer
(top). The plot shows 5 minutes of Imago, with strong correlation between both audience
members and similarities with the changes in the performer during specific sections.
The continuous analysis of the GSR signals indicates a strong correlation with the musical characteristics of the performance. In the piano improvisation piece, the performer made several gestures in anticipation of the notes about to be played, which triggered increases in the GSR levels of the audience. During the electroacoustic piece, the composition featured strong dynamic changes, with long crescendos and sudden silences; these also resulted in significant changes in the GSR level (see Fig. 3). The most interesting results were observed when we overlaid the performer's GSR and HRV signals individually with those of each audience member. During certain passages of the musical pieces, there is strong correlation between the physiological signals. Fig. 3 to Fig. 5 show examples where this phenomenon occurred.
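One simple way to localise such passages computationally is a sliding-window Pearson correlation between the performer's signal and each audience member's signal. The sketch below is illustrative: the window and hop lengths are assumed parameters, and the overlays in this study were inspected visually via the Max/MSP patch rather than with this exact procedure.

```python
import numpy as np

def windowed_correlation(performer, audience, fs=250, win_s=30.0, hop_s=5.0):
    """Pearson correlation between two equally sampled signals over
    sliding windows; window and hop lengths are illustrative choices."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    n = min(len(performer), len(audience))
    corr = []
    for start in range(0, n - win + 1, hop):
        p = performer[start:start + win]
        a = audience[start:start + win]
        corr.append(np.corrcoef(p, a)[0, 1])  # correlation for this window
    return np.array(corr)
```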
Fig. 4. HRV for the performer (top) and three selected audience members (bottom) during 2.5 minutes of Stem Cells in Durham. The plot shows a simultaneous increase and subsequent decrease in HR for both performer and audience at approximately 450 seconds.
Fig. 5. GSR signals for performer and one selected audience member. The plot shows a strong
and continuous correlation during 5 minutes of Stem Cells in Durham.
5 Discussion and Conclusions
We have presented a novel approach to the study of physiological correlates of emotion between performer and audience. Preliminary results indicate significant levels of correlation, both for GSR and ECG signals; however, further studies are needed in order to obtain conclusive results. The use of additional physiological features, such as respiration rate and depth, has given interesting results in previous studies [8], and we suggest incorporating them in future experiments.
The actual mechanisms by which emotional contagion occurs are still largely undefined (some indicators may be found in [15] and [16]), but a theory which is currently showing promise is that of 'mirror' neurons in the brain, which mimic externally perceived actions or conditions with a corresponding impulse in a related part of the observer's brain, e.g. seeing someone running causes the neurons responsible for movement to fire in the brain of the observer [17].
Auditory and visual cues are also likely to have an effect on a participant's affective state, and there are indicators in our findings suggesting correlations between visually led anticipation and changes in GSR. We have also found links between sudden or extreme auditory events and physiological changes (some of which may be explained by the 'startle response' [18, p. 647]). Analysis of the video recordings in conjunction with the time-stamped biophysical data allows us to link specific auditory or visual events with corresponding physiological changes, and to isolate periods in which there are physiological changes in the absence of such cues.
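As a sketch of how such event-linked analysis can be automated, one can compare the GSR level just after each time-stamped event with the level just before it. The window lengths below are assumptions for illustration, not the procedure used in this study.

```python
import numpy as np

def gsr_response_to_events(gsr, event_times_s, fs=250,
                           baseline_s=2.0, response_s=5.0):
    """For each annotated event time (from the synchronized video),
    compare mean GSR in a short window after the event with the
    pre-event baseline. Window lengths are illustrative assumptions."""
    results = []
    for t in event_times_s:
        onset = int(t * fs)
        base = gsr[max(0, onset - int(baseline_s * fs)):onset]
        resp = gsr[onset:onset + int(response_s * fs)]
        if len(base) and len(resp):
            results.append(resp.mean() - base.mean())  # positive => GSR rise
    return np.array(results)
```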
One of the biggest problems in working in an ecological scenario such as a live concert is the constraints imposed by time and the nature of an invited audience, which reduce the options for calibration and for changing materials in the case of technical problems. Nevertheless, we believe that methodologies such as the one presented in this study are an important step towards creating a more natural environment, in which questions addressing the complex relationship between music, emotion and physiology are not affected by a laboratory set-up.
References
1. R.W. Picard, Affective Computing, MIT Press, 1997.
2. A. Kleinsmith and N. Bianchi-Berthouze, "Recognizing Affective Dimensions from Body Posture," Lisbon, Portugal: Springer-Verlag, 2007, pp. 48-58.
3. A. Haag, S. Goronzy, P. Schaich, and J. Williams, "Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System," Affective Dialogue Systems, 2004, pp. 36-48.
4. M. Scheutz, "Surviving in a Hostile Multi-agent Environment: How Simple Affective States Can Aid in the Competition for Resources," Advances in Artificial Intelligence, 2000, pp. 389-399.
5. R. Zeelenberg, E. Wagenmakers, and M. Rotteveel, "The impact of emotion on perception: bias or enhanced processing?," Psychological Science: A Journal of the American Psychological Society / APS, vol. 17, Apr. 2006, pp. 287-291.
6. J.A. Sloboda and P.N. Juslin, "Psychological perspectives on music and emotion," Music and Emotion: Theory and Research, New York: Oxford University Press, 2001, pp. 71-104.
7. J. Panksepp, "The emotional sources of 'chills' induced by music," Music Perception, vol. 13, Winter 1995, pp. 171-207.
8. J. Kim and E. André, "Emotion Recognition Based on Physiological Changes in Music Listening," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, 2008, pp. 2067-2083.
9. P.N. Juslin and D. Västfjäll, "Emotional responses to music: the need to consider underlying mechanisms," The Behavioral and Brain Sciences, vol. 31, Oct. 2008, pp. 559-575; discussion 575-621.
10. J.A. Etzel, E.L. Johnsen, J. Dickerson, D. Tranel, and R. Adolphs, "Cardiovascular and respiratory responses during musical mood induction," International Journal of Psychophysiology, vol. 61, Jul. 2006, pp. 57-69.
11. O. Grewe, R. Kopiez, and E. Altenmüller, "Chills as an indicator of individual emotional peaks," Annals of the New York Academy of Sciences, vol. 1169, Jul. 2009, pp. 351-354.
12. N. Coghlan and R.B. Knapp, "Sensory Chairs: A system for biosignal research and performance," Proceedings of the 8th International Conference on New Interfaces for Musical Expression, Genova: 2008, pp. 233-236.
13. S.D. Kreibig, F.H. Wilhelm, W.T. Roth, and J.J. Gross, "Cardiovascular, electrodermal, and respiratory response patterns to fear- and sadness-inducing films," Psychophysiology, vol. 44, Sep. 2007, pp. 787-806.
14. J. Jaimovich and R.B. Knapp, "Pattern Recognition of Emotional States During Musical Performance from Physiological Signals," Proceedings of the 2009 International Computer Music Conference, Montreal, Canada: 2009, pp. 461-464.
15. E. Hatfield, R. Rapson, and L. Le, "Primitive emotional contagion: Recent research," The Social Neuroscience of Empathy, The MIT Press, 2009.
16. R. Neumann and F. Strack, "'Mood contagion': the automatic transfer of mood between persons," Journal of Personality and Social Psychology, vol. 79, Aug. 2000, pp. 211-223.
17. G. Rizzolatti and L. Craighero, "The mirror-neuron system," Annual Review of Neuroscience, 2004, pp. 169-192.
18. J.T. Cacioppo, L.G. Tassinary, and G.G. Berntson, Handbook of Psychophysiology, Cambridge University Press, 2007.