Table 1: Mean Velocities Over 81 Edge Segments (Em.M. = Emotion Model).

Emotion    Mean Avatar  Mean Em.M.  Abs. Diff.  S.D. Avatar  S.D. Em.M.
Anger        -.0003       -.0009      .0006       .0027        .0017
Fear         -.0003       -.0008      .0005       .0027        .0017
Sadness       .0014       -.0002      .0016       .0090        .0058
Surprise     -.0003       -.0008      .0005       .0027        .0017
5 CONCLUSION
For all four emotion labels, the mean of the Avatar velocities falls within ±1 standard deviation of the corresponding Emotion Model mean. Assuming approximately normal distributions, this places the Avatar's velocity behavior inside the band containing roughly 68% of the Emotion Model's responses to the same stimuli. This observation is reinforced by the fact that the variance of each histogram is one to two orders of magnitude smaller than its standard deviation. Given such narrow variances, we conclude that the emotion velocities of all four emotion labels are substantially similar.
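As a minimal sketch of this evaluation, the check below compares each Avatar mean against the ±1 standard deviation band of the Emotion Model using the values from Table 1. The variable names and table structure are illustrative assumptions, not part of the published pipeline.

```python
# Minimal sketch of the +/-1 SD comparison described above.
# Values are copied from Table 1; all names here are illustrative.

TABLE_1 = {
    # emotion: (mean_avatar, mean_model, sd_avatar, sd_model)
    "Anger":    (-0.0003, -0.0009, 0.0027, 0.0017),
    "Fear":     (-0.0003, -0.0008, 0.0027, 0.0017),
    "Sadness":  ( 0.0014, -0.0002, 0.0090, 0.0058),
    "Surprise": (-0.0003, -0.0008, 0.0027, 0.0017),
}

for emotion, (mean_avatar, mean_model, sd_avatar, sd_model) in TABLE_1.items():
    abs_diff = abs(mean_avatar - mean_model)
    within_one_sd = abs_diff <= sd_model    # the +/-1 SD criterion
    variance_model = sd_model ** 2          # variance << SD when SD < 1
    print(f"{emotion:8s}  |diff|={abs_diff:.4f}  "
          f"within 1 SD: {within_one_sd}  model variance={variance_model:.2e}")
```

Running this check confirms that each absolute difference in Table 1 is smaller than the Emotion Model's standard deviation for that label, which is the basis of the 68% claim above.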
The contribution of our proposed method is to create a ground-truth reference for a single subject. The experiments in this research rest on the assumption that the corpora used to validate the NNs embedded in the FER software are sufficient to build a secondary corpus like the one we propose, one designed to simulate a single actor's character rather than to recognize facial emotion across a wide set of generic human faces. This approach aims to streamline character animation in video game production by leveraging the work of the human annotators who contributed to the development of FERs. By applying programmable statistical techniques such as those used in this research, a more automated process for evaluating facial emotion corpora could accelerate the adoption of emotion AI in NPCs for future game production.