These findings have several applications. An
understanding of facial expressions has applications in
mental health settings, where it can help identify
mental state, pain intensity, feigning of symptoms,
the subjective experience of treatments and
interventions, automated counselling, and more. Such
findings are also likely to affect human-computer
interaction (HCI), interactive video, and other related
areas. Calder et al. (2001) classified emotion
expressions into three categories, each with a
takeaway for HCI research. Happiness and surprise
can be detected easily irrespective of the distance
between the expressor and the perceiver. Anger and
sadness are reasonably well detected from proximity.
Fear and disgust constitute the third group, which
people are not very good at recognizing. Although we
also found a relationship between happiness and
surprise, our findings deviate little from those of
Calder et al. (2001). These findings might be useful
for HCI researchers seeking systems that can at least
reasonably imitate human perceptual ability.
Some researchers report variability in the perception
of dynamic expressions in clinical populations such as
Pervasive Developmental Disorder (Uono, Sato, &
Toichi, 2010) and Asperger syndrome (Kätsyri et al.,
2008). The stimuli used in the present study have
graded intensity levels in addition to the dynamic
nature of facial expression, and thus might be useful
for studying clinical populations as well.
The advantage of the two databases analyzed in
this work is that they contain static stimuli extracted
from dynamic sources that represent real-life
conditions. Together, they comprise facial
expressions of all six basic emotions at six intensity
levels and from five different viewing angles.
However, there is an inherent limitation as well:
IDBE consists of facial expressions of only one male
expressor, while IAPD comprises expressions from
five different viewing angles but not of variable
intensity. Although the absence of a larger database
limits the generalizability of specific findings, it does
establish that RMS and fractal dimension can be
applied effectively in behavioural science studies as
well.
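To illustrate how such measures can be computed on image data, the sketch below gives the RMS of pixel intensities and a box-counting estimate of fractal dimension for a binary image. This is a minimal illustration only: the function names, box sizes, and test image are the author's assumptions and are not taken from the present study's analysis pipeline.

```python
import numpy as np

def rms(img):
    """Root-mean-square of pixel intensities (a simple energy/contrast measure)."""
    x = np.asarray(img, dtype=float)
    return np.sqrt(np.mean(x ** 2))

def box_count(img, box_size):
    """Count boxes of side `box_size` containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if img[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(img, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension as the slope of
    log(count) against log(1/box_size)."""
    counts = [box_count(img, s) for s in box_sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                          np.log(counts), 1)
    return slope

# A completely filled square is a 2-D region, so its estimated
# box-counting dimension should be (close to) 2.
filled = np.ones((64, 64), dtype=bool)
print(round(fractal_dimension(filled), 2))  # → 2.0
```

For a real facial-expression image one would first threshold or edge-detect the face region to obtain the binary input; the fitted slope then summarizes the geometric complexity of the expression contour.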
REFERENCES
Athe, P., Shakya, S., Munshi, P., Luke, A., & Mewes, D.
(2013). Characterization of multiphase flow in bubble
columns using KT-1 signature and fractal dimension.
Flow Measurement and Instrumentation, 33, 122-137.
Bhatt, V., Munshi, P., & Bhattacharjee, J. K. (1991).
Application of fractal dimension for nondestructive
testing. Materials Evaluation, 49, 1414-1418.
Bhushan, B. (2007). Subjective analysis of facial
expressions: Inputs from behavioural research for
automated systems. Unpublished project report INI-
IITK-20060049, Indian Institute of Technology,
Kanpur.
Bhushan, B. (2015). Study of facial micro-expressions in
psychology. In A. Awasthi & M. K. Mandal (Eds.),
Understanding facial expressions in communication:
Cross-cultural and multidisciplinary perspective (pp.
265-286). Springer. https://doi.org/10.1007/978-81-322-1934-7_13
Bould, E. & Morris, N. (2008). Role of motion signals in
recognizing subtle facial expressions of emotion.
British Journal of Psychology, 99, 167-189.
https://doi.org/10.1348/000712607X206702
Calder, A. J., Burton, A. M., Miller, P., Young, A. W., &
Akamatsu, S. (2001). A principal component analysis
of facial expressions. Vision Research, 41, 1179-1208.
https://doi.org/10.1016/S0042-6989(01)00002-5
Calvo, M. G. & Lundqvist, D. (2008). Facial expressions of
emotion (KDEF): Identification under different
display-duration conditions. Behavior Research
Methods, 40, 109-115.
https://doi.org/10.3758/BRM.40.1.109
Calvo, M. G., Gutiérrez-García, A., Fernández-Martín, A.,
& Nummenmaa, L. (2014). Recognition of facial
expressions of emotion is related to their frequency in
everyday life. Journal of Nonverbal Behavior, 38, 549-
567. https://doi.org/10.1007/s10919-014-0191-3
Du, S. & Martinez, A. M. (2011). The resolution of facial
expressions of emotion. Journal of Vision, 11, 24.
https://doi.org/10.1167/11.13.24
Harms, M. B., Martin, A., & Wallace, G. L. (2010). Facial
emotion recognition in autism spectrum disorders: a
review of behavioral and neuroimaging studies.
Neuropsychology Review, 20, 290-322.
https://doi.org/10.1007/s11065-010-9138-6
Hess, U., Adams, R. B., & Kleck, R. E. (2009). The face is
not an empty canvas: how facial expressions interact
with facial appearance. Philosophical Transactions of
the Royal Society B: Biological Sciences, 364, 3497-
3504. https://doi.org/10.1098/rstb.2009.0165
Kätsyri, J., Saalasti, S., Tiippana, K., von Wendt, L.,
& Sams, M. (2008). Impaired recognition of facial
emotions from low-spatial frequencies in Asperger
syndrome. Neuropsychologia, 46, 1888-1897.
https://doi.org/10.1016/j.neuropsychologia.2008.01.005
Lander, K. & Butcher, N. (2015). Independence of face
identity and expression processing: exploring the role
of motion. Frontiers in Psychology, 6, 255.
https://doi.org/10.3389/fpsyg.2015.00255
Leppänen, J. M. & Hietanen, J. K. (2004). Positive facial
expressions are recognized faster than negative facial
expressions, but why? Psychological Research, 69, 22-
29. https://doi.org/10.1007/s00426-003-0157-2