Reliability of the video stimulation was assessed with test-retest reliability for 22 participants two months after the experiment. The test-retest reliability was 0.684, indicating low reliability of the responses to the video stimulation. The mean and standard deviation of the ratings were 8.05 ± 1.32 at the first measurement and 7.45 ± 1.67 at the second measurement. These results suggest a learning bias for the video stimulation.
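As an illustration, the test-retest statistic above can be reproduced with a short script; the per-participant ratings below are simulated placeholders drawn from assumed distributions, not the study data.

```python
# Minimal sketch of the test-retest reliability computation.
# The ratings are simulated placeholders, not the actual study data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
session1 = rng.normal(8.05, 1.32, size=22)                 # first measurement
session2 = session1 - 0.6 + rng.normal(0.0, 1.0, size=22)  # retest two months later

r, p = pearsonr(session1, session2)                        # test-retest reliability
print(f"test-retest r = {r:.3f} (p = {p:.4f})")
print(f"session 1: {session1.mean():.2f} +/- {session1.std(ddof=1):.2f}")
print(f"session 2: {session2.mean():.2f} +/- {session2.std(ddof=1):.2f}")
```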
4 DISCUSSION
Our results indicated that AU6 and AU12 did not support the hypothesis that happy emotion can be detected from facial expression. Facial emotion expression is formed by neck and facial muscular movement, which is influenced by the sympathetic and parasympathetic nervous systems; thus, psychological states in the central and peripheral nervous systems influence facial expression (Meier et al., 2016).
We found negative findings, or low discriminative power, for AU6 and AU12 in dissociating happy facial expressions. Two main factors may explain this result. First, there are individual differences in stimulus interpretation and in facial emotion expression (Maoz et al., 2016). However, at trend level we found that the higher a participant's happiness score, the larger the area under the ROC curve for AU6 and AU12 (Table 1).
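The per-AU discrimination analysis behind Table 1 can be sketched as follows, assuming frame-level OpenFace 2.0 intensity columns (AU06_r, AU12_r) and a hypothetical condition column in the exported CSV.

```python
# Minimal sketch of per-AU ROC analysis on OpenFace 2.0 output.
# The file name and the "condition" column are hypothetical assumptions.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("openface_output.csv")
df.columns = df.columns.str.strip()               # OpenFace pads header names with spaces
labels = df["condition"].eq("happy").astype(int)  # 1 = happy trial, 0 = control

for au in ("AU06_r", "AU12_r"):
    auc = roc_auc_score(labels, df[au])           # area under the ROC curve
    print(f"{au}: AUC = {auc:.3f}")               # ~0.5 means no discrimination
```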
Second, the co-occurrence of other emotions during the experimental procedure influenced the zygomaticus major, the muscle underlying AU12. This study did not control for other emotions during stimulus presentation.
The low power to dissociate between the conditions may also stem from the reliability and sensitivity of the computer software (Menzel, Redies, & Hayn-Leichsenring, 2018). Validity and reliability should therefore be established through replication studies and through comparison with human observers.
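One way to carry out the suggested comparison with a human observer is a chance-corrected agreement statistic such as Cohen's kappa; the binary AU12 presence codes below are hypothetical placeholders.

```python
# Minimal sketch of software-versus-human agreement on AU12 presence.
# Both code sequences are hypothetical placeholders, frame by frame.
from sklearn.metrics import cohen_kappa_score

software_au12 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]  # automatic AU12 detections
human_au12    = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]  # certified FACS coder

kappa = cohen_kappa_score(software_au12, human_au12)
print(f"Cohen's kappa = {kappa:.2f}")           # chance-corrected agreement
```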
The theoretical implication of this finding is that measuring psychological emotion from facial muscular activity, or facial action units, needs careful evaluation. Happiness may be a complex emotion that should be detected with multiple action units and multimodal parameters (head pose, gaze, or other action units).
The practical implication of our finding supports the development and use of FACS in human emotion detection for real-time detection systems.
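A minimal sketch of such a multi-cue detector, assuming OpenFace 2.0 frame-level features (AU intensities, gaze angles, head rotation) and a hypothetical per-frame happiness label, could look like this.

```python
# Minimal sketch of multi-cue happiness classification from OpenFace features.
# The CSV file and the "happy" label column are hypothetical assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("openface_output.csv")
df.columns = df.columns.str.strip()

features = ["AU06_r", "AU12_r",                  # cheek raiser, lip corner puller
            "gaze_angle_x", "gaze_angle_y",      # gaze direction
            "pose_Rx", "pose_Ry", "pose_Rz"]     # head rotation
X, y = df[features], df["happy"]                 # y: 1 = happy frame, 0 = not

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)        # 5-fold cross-validation accuracy
print(f"mean CV accuracy = {scores.mean():.3f}")
```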
5 CONCLUSION
Our research findings show a lack of validated measurement using facial action units. Our study was limited to AU6 and AU12, following Ekman's FACS. Other measurement modalities, such as gaze and head pose, and other facial action units may be involved in happy facial expression. Given the limited reliability of the video stimulation, the experimental design, and the variation in facial expression, further replication studies validating facial action units as a measure of emotion, especially happiness, are worthwhile.
REFERENCES
Amos, B., Ludwiczuk, B., & Satyanarayanan, M. 2016. OpenFace: A general-purpose face recognition library with mobile applications. Tech. Rep. CMU-CS-16-118, CMU School of Computer Science.
Baltrušaitis, T., Zadeh, A., Lim, Y.C., & Morency, L.-P. 2018. OpenFace 2.0: Facial behavior analysis toolkit. In IEEE International Conference on Automatic Face and Gesture Recognition.
Erkoç, T., Ağdoğan, D., & Eskil, M. T. 2018. An
observation based muscle model for simulation
of facial expressions. Signal Processing: Image
Communication, 64, 11-20.
Huang, J., Wang, Y., Jin, Z., Di, X., Yang, T., Gur, R.C., & Chan, R.C.K. 2013. Happy facial expression processing with different social interaction cues: An fMRI study of individuals with schizotypal personality traits. Progress in Neuro-Psychopharmacology and Biological Psychiatry, 44, 108-117.
Kerestes, R., Segreti, A.M., Pan, L.A., Phillips,
M.L., Birmaher, B., Brent, D.A., & Ladouceur,
C.D. 2016. Altered neural function to happy
faces in adolescents with and at risk for
depression. Journal of Affective Disorders, 192,
143-152.
Kotsia, I., Zafeiriou, S., & Pitas, I. 2008. Texture
and shape information fusion for facial
expression and facial action unit recognition.
Pattern Recognition, 41(3), 833-851.
Liong, S.-T., See, J., Wong, K., & Phan, R.C.W. 2018. Less is more: Micro-expression recognition from video using apex frame. Signal Processing: Image Communication, 62, 82-92.
Maoz, K., Eldar, S., Stoddard, J., Pine, D.S.,
Leibenluft, E., & Bar-Haim, Y. 2016. Angry-