Predictor-based Control of Human Emotions When Reacting to a
Dynamic Virtual 3D Face Stimulus
Vytautas Kaminskas, Edgaras Ščiglinskas and Aušra Vidugirienė
Department of Systems’ Analysis, Vytautas Magnus University, Vileikos g. 8, Kaunas, Lithuania
Keywords: 3D Face, Human Emotions, Input-Output Model, Input Signal Boundaries, Predictor-based Control.
Abstract: This paper introduces how predictor-based control principles are applied to the control of human emotion signals (excitement and frustration). We use the changing distance-between-eyes of a virtual 3D face as a control signal. A predictor-based control law is synthesized by minimizing a control quality criterion in an admissible domain. The admissible domain is defined by input signal boundaries. Modelling results demonstrate relatively high control quality for the excitement and frustration signals. Input signal boundaries allow decreasing the variation of changes in the virtual 3D face.
1 INTRODUCTION
As virtual environments have become a part of our daily life, including computer games, learning environments, social networks and their games, there is a need to protect children and adults from harmful effects such as addiction to the virtual environment or even various mental disorders (Calvo et al., 2015; Scherer et al., 2010). For this purpose, a control mechanism for human state regulation is needed. Brain-computer interfaces and their applications are one of the means that help to regulate human state and emotions in different environments and circumstances (Graimann et al., 2011; Tan and Nijholt, 2010). We use EEG-based signals because of their reliability and quick response (Sourina and Liu, 2011; Hondrou and Caridakis, 2012).
We have investigated predictive input-output structure models for exploring the dependencies between virtual 3D face features and human reactions to them in Kaminskas et al. (2014) and Vaškevičius et al. (2014), as a person reacts quickly even to the smallest face feature changes (Willis and Todorov, 2006). Predictive models are necessary in the design of predictor-based control systems (Åström and Wittenmark, 1997; Clarke, 1994; Kaminskas, 2007).
This paper introduces how predictor-based
control principles are applied to the control of
human emotion signals (excitement and frustration).
We use changing distance-between-eyes in a virtual
3D face as a control signal.
2 INPUT-OUTPUT CONTROL
PLANT
A virtual 3D face with changing distance-between-eyes was used as an input stimulus (shown on a computer monitor to a volunteer), and EEG-based pre-processed excitement and frustration signals of the volunteer were measured as output (Figure 1). The output signals were recorded with an Emotiv Epoc device that records EEG inputs from 14 channels (according to the international 10-20 locations): AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4 (Emotiv Epoc specifications). The dynamic stimulus was formed from a changing woman's face. A 3D face created with Autodesk MAYA was used as the "neutral" one (Figure 1, left).
Figure 1: Input-Output scheme.
Other 3D faces were formed by changing the distance-between-eyes in an extreme manner (Figure 2). The transitions between the normal and extreme stages were programmed. The "neutral" face has the value 0, the largest distance-between-eyes
corresponds to the value 3, and the smallest distance-between-eyes corresponds to the value -3.
Figure 2: A 3D virtual face with the smallest (left), normal
(middle) and the largest (right) distance-between-eyes.
Values of the output signals (excitement and frustration) vary from 0 to 1. If excitement and frustration are low, their values are close to 0, and if they are high, their values are close to 1. The signals were recorded with the sampling period T_0 = 0.5 s.
The dependency between the virtual 3D face feature (distance-between-eyes) and a human emotion (excitement or frustration) is described by an input-output structure linear model (Kaminskas et al., 2014):

A(z^{-1}) y_t = B(z^{-1}) x_t + c + \xi_t,   (1)

where

A(z^{-1}) = 1 + \sum_{i=1}^{n} a_i z^{-i}, \quad B(z^{-1}) = \sum_{j=0}^{m} b_j z^{-j},   (2)

y_t is the output (excitement or frustration) and x_t is the input (distance-between-eyes) signal, respectively expressed as

y_t = y(t T_0), \quad x_t = x(t T_0), \quad t = 0, 1, 2, \ldots,   (3)

with sampling period T_0; c is a constant value, \xi_t corresponds to a noise signal, and z^{-1} is the backward-shift operator (z^{-1} x_t = x_{t-1}).
Eq. (1) can be expressed in the following form:

y_t = c + B(z^{-1}) x_t - [A(z^{-1}) - 1] y_t + \xi_t.   (4)
The parameters of model (1) (the coefficients a_i and b_j, the degrees m and n of the polynomials (2), and the constant c) are unknown. They have to be estimated from the observations obtained during the experiments with the volunteers (Kaminskas et al., 2014).
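Once model (1) is rewritten in the regression form (4), the estimation reduces to an ordinary least-squares problem. The following Python sketch illustrates one way this could be done for given degrees n and m, assuming the observations are available as NumPy arrays; the function name and array conventions are illustrative, not code from the cited identification papers.

```python
import numpy as np

def estimate_arx(y, x, n, m):
    """Least-squares estimates of a_1..a_n, b_0..b_m and c for model (1),
    using the regression form y_t = -sum a_i y_{t-i} + sum b_j x_{t-j} + c.
    Illustrative sketch; y and x are 1-D arrays sampled with period T0."""
    d = max(n, m)
    rows, targets = [], []
    for t in range(d, len(y)):
        row = [-y[t - i] for i in range(1, n + 1)]   # -y_{t-1}, ..., -y_{t-n}
        row += [x[t - j] for j in range(m + 1)]      # x_t, ..., x_{t-m}
        row.append(1.0)                              # regressor for the constant c
        rows.append(row)
        targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(targets), rcond=None)
    a, b, c = theta[:n], theta[n:n + m + 1], theta[-1]
    return a, b, c
```

In practice, the degrees n and m would be selected by comparing one-step prediction errors over the observed data, as in the cited identification procedure.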
3 DIGITAL PREDICTOR-BASED
CONTROL WITH
CONSTRAINTS
A predictor-based control law is synthesized by minimizing a control quality criterion Q_{t+1} in an admissible domain \Omega_t (Kaminskas, 2007):

x_{t+1}: \quad Q_{t+1} \to \min_{x_{t+1} \in \Omega_t},   (5)

Q_{t+1} = M\{ [y_{t+1} - y^*_{t+1}]^2 \},   (6)

\Omega_t = \{ x : x_{\min} \le x \le x_{\max}, \; |x - x_t| \le \delta_t \},   (7)

where M is the mathematical expectation sign, y^*_{t+1} is a reference signal (the reference trajectory for the emotion signal), x_{\min} and x_{\max} are the input signal boundaries (smallest and largest distance-between-eyes), \delta_t > 0 are the restriction values for the change rate of the input signal, and |\cdot| denotes the absolute value.
Then, solving the minimization problem (5)-(7) for the one-step prediction model

\hat{y}_{t+1|t} = c + B(z^{-1}) x_{t+1} - [A(z^{-1}) - 1] y_{t+1},   (8)

the control law is described by the equations

x_{t+1} = \min\{ \bar{x}_{t+1}, \; x_t + \delta_t, \; x_{\max} \}, \quad \text{if } \bar{x}_{t+1} \ge x_t,
x_{t+1} = \max\{ \bar{x}_{t+1}, \; x_t - \delta_t, \; x_{\min} \}, \quad \text{if } \bar{x}_{t+1} < x_t,   (9)

\bar{x}_{t+1} = -\frac{1}{b_0} \{ [B(z^{-1}) - b_0] x_{t+1} - \tilde{y}_{t+1} \},   (10)

\tilde{y}_{t+1} = y^*_{t+1} - c - [1 - A(z^{-1})] y_{t+1},   (11)

where z is the forward-shift operator (z x_t = x_{t+1}).
If the roots of the polynomial

\tilde{B}(z) = z^m B(z^{-1})   (12)

are in the unit disk,

|z_j| \le 1, \quad \tilde{B}(z_j) = 0, \quad j = 1, 2, \ldots, m,   (13)

then from (10) and (11) the following equation is correct:

\bar{x}_{t+1} = \frac{z^m}{\tilde{B}(z)} \{ y^*_{t+1} - c - [1 - A(z^{-1})] y_{t+1} \}.   (14)

If a part or all of the roots of polynomial (12) do not belong to the unit disk, a factorization of the polynomial \tilde{B}(z) is performed (Åström and Wittenmark, 1997).
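To make the computation concrete, the sketch below implements one step of the control law (9)-(11) in Python: the unconstrained input is obtained by equating the one-step prediction (8) to the reference, then projected onto the admissible domain (7). The history-array layout and variable names are illustrative assumptions.

```python
def control_step(a, b, c, y_hist, x_hist, y_ref, x_min, x_max, delta):
    """One step of the constrained predictor-based control law (9)-(11).
    a = [a_1..a_n], b = [b_0..b_m]; y_hist and x_hist hold past samples
    newest first (y_hist[0] = y_t, x_hist[0] = x_t). Illustrative sketch."""
    n, m = len(a), len(b) - 1
    # Equate prediction (8) to the reference y*:
    # b_0 x_{t+1} = y* - c + sum_i a_i y_{t+1-i} - sum_{j>=1} b_j x_{t+1-j}
    s = y_ref - c
    s += sum(a[i] * y_hist[i] for i in range(n))
    s -= sum(b[j] * x_hist[j - 1] for j in range(1, m + 1))
    x_bar = s / b[0]
    # Projection onto the admissible domain (7): level and rate limits
    x_t = x_hist[0]
    if x_bar >= x_t:
        return min(x_bar, x_t + delta, x_max)
    return max(x_bar, x_t - delta, x_min)
```

Note that the projection only needs the previous input x_t and the constants of the admissible domain, so the law remains cheap enough for the 0.5 s sampling period.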
4 MODELLING RESULTS
The experiments consisted of two phases. In the first phase, human emotional signals (excitement and frustration) were observed as reactions to three types of dynamic 3D face stimuli (testing input). From these observations, the parameter estimates of the input-output model (1) were calculated. Using these estimates, in the second phase the dynamic virtual 3D face features were formed according to the control law (9)-(11) (control input). The control tasks were to maintain a high excitement level and a low frustration level (reference signals). In this case, the control efficiency can be evaluated by the relative measure

\Delta = \frac{ | \bar{y} - \bar{y}^T | }{ \bar{y}^T } \cdot 100\%,   (15)

where \bar{y}^T is the average of the output (excitement or frustration) as a reaction to the testing input, and \bar{y} is the average of the output as a reaction to the control input.
Figure 3: Excitement control (volunteer no. 1, female), when \delta_t = 1/s (left) and \delta_t = 6/s (right). Top: solid lines denote the reference signal y^*, dotted lines denote the output (excitement) as a reaction to the control input, and dashed lines denote the output as a reaction to the testing input; middle: control input (distance-between-eyes); bottom: testing input (distance-between-eyes).
Figure 4: Excitement control (volunteer no. 3, male), when \delta_t = 1/s (left) and \delta_t = 6/s (right). Top: solid lines denote the reference signal y^*, dotted lines denote the output (excitement) as a reaction to the control input, and dashed lines denote the output as a reaction to the testing input; middle: control input (distance-between-eyes); bottom: testing input (distance-between-eyes).
Figure 5: Frustration control (volunteer no. 4, female), when \delta_t = 1/s (left) and \delta_t = 6/s (right). Top: solid lines denote the reference signal y^*, dotted lines denote the output (frustration) as a reaction to the control input, and dashed lines denote the output as a reaction to the testing input; middle: control input (distance-between-eyes); bottom: testing input (distance-between-eyes).
Figure 6: Frustration control (volunteer no. 5, male), when \delta_t = 1/s (left) and \delta_t = 6/s (right). Top: solid lines denote the reference signal y^*, dotted lines denote the output (frustration) as a reaction to the control input, and dashed lines denote the output as a reaction to the testing input; middle: control input (distance-between-eyes); bottom: testing input (distance-between-eyes).
These measures are given in Table 1 and Table 2.
Table 1: Efficiency of excitement control.

Volunteer no.   \Delta, % (\delta_t = 1/s)   \Delta, % (\delta_t = 6/s)
1 (female)      119.9                        133.6
2 (male)         90.1                        103.6
3 (male)        205.6                        205.5
Table 2: Efficiency of frustration control.

Volunteer no.   \Delta, % (\delta_t = 1/s)   \Delta, % (\delta_t = 6/s)
1 (female)       35.8                         35.1
4 (female)       39.0                         36.6
5 (male)         40.3                         40.8
6 (male)         27.4                         30.4
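As a minimal sketch, measure (15) can be computed directly from the two recorded output signals, assuming they are available as NumPy arrays:

```python
import numpy as np

def control_efficiency(y_test, y_ctrl):
    """Relative measure (15): percentage difference between the average
    output under the control input and under the testing input (sketch)."""
    y_t_avg = float(np.mean(y_test))   # average reaction to testing input
    y_c_avg = float(np.mean(y_ctrl))   # average reaction to control input
    return abs(y_c_avg - y_t_avg) / y_t_avg * 100.0
```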
Excitement and frustration control results are
shown in Figs. 3-7.
Figure 7: Frustration control (volunteer no. 6, male), when \delta_t = 1/s (left) and \delta_t = 6/s (right). Top: solid lines denote the reference signal y^*, dotted lines denote the output (frustration) as a reaction to the control input, and dashed lines denote the output as a reaction to the testing input; middle: control input (distance-between-eyes); bottom: testing input (distance-between-eyes).
The modelling results show that, using predictor-based control with constraints, a sufficiently good quality of human emotional signal control can be reached. The excitement level can be raised up to 2 times in comparison with the testing input, and the frustration level can be lowered by about 1/3 in comparison with the testing input. The control quality is influenced by the control signal variation speed, which is limited by the parameter \delta_t of the admissible domain. This parameter allows decreasing the control signal variation, which is usually high in predictor-based control systems without constraints.
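The effect of the rate restriction can also be examined in simulation: the identified model (1) stands in for the volunteer and is driven by the control law of Section 3. The sketch below reuses estimate_arx and control_step from the earlier sketches; it is an illustrative assumption of such a setup, not the experimental procedure itself. The default boundaries follow the distance-between-eyes range of -3..3, while the other default values are arbitrary.

```python
import numpy as np

def simulate_closed_loop(a, b, c, y_ref, x_min=-3.0, x_max=3.0,
                         delta=0.5, noise_std=0.0, steps=200, seed=0):
    """Closed-loop sketch: model (1) replaces the volunteer, control_step()
    (defined earlier) computes the face feature. delta is the per-step
    change limit; with T0 = 0.5 s, a rate of 1/s corresponds to 0.5 per step."""
    rng = np.random.default_rng(seed)
    n, m = len(a), len(b) - 1
    y, x = np.zeros(steps), np.zeros(steps)
    for t in range(max(n, m), steps - 1):
        y_hist = y[t::-1][:n]            # y_t, y_{t-1}, ... (newest first)
        x_hist = x[t::-1][:max(m, 1)]    # x_t, x_{t-1}, ...
        x[t + 1] = control_step(a, b, c, y_hist, x_hist,
                                y_ref, x_min, x_max, delta)
        # Simulated response, model (4): y = c + B x - (A - 1) y + noise
        y[t + 1] = (c + sum(b[j] * x[t + 1 - j] for j in range(m + 1))
                    - sum(a[i - 1] * y[t + 1 - i] for i in range(1, n + 1))
                    + rng.normal(0.0, noise_std))
    return x, y
```

In such a simulation, lowering delta produces a visibly smoother face-feature trajectory at the cost of slower reference tracking, which matches the role of \delta_t described above.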
5 CONCLUSIONS
Predictor-based control with constraints was applied to control human emotions (excitement and frustration) reacting to a dynamic stimulus (a virtual 3D face with changing distance-between-eyes).
A sufficiently good control quality of the excitement and frustration signals was demonstrated by the modelling results. Input signal boundaries allow decreasing the variation of changes in the virtual 3D face.
ACKNOWLEDGEMENTS
The postdoctoral fellowship of Aušra Vidugirienė was funded by the European Union Structural Funds project "Postdoctoral Fellowship Implementation in Lithuania" within the framework of the Measure for Enhancing Mobility of Scholars and Other Researchers and the Promotion of Student Research (VP1-3.1-ŠMM-01) of the Program of Human Resources Development Action Plan.
REFERENCES
Åström, K.J., and Wittenmark, B., 1997. Computer Controlled Systems – Theory and Design. 3rd ed. Prentice Hall Inc.
Calvo, R.A., D’Mello, S.K., Gratch, J., Kappas, A.,
(editors), 2015. The Oxford Handbook of Affective
Computing. Oxford library of psychology. Oxford
University Press, 2015.
Clarke, D.W., 1994. Advances in Model Predictive
Control. Oxford Science Publications, UK, 1994.
Hondrou, C., Caridakis, G., 2012. Affective, Natural
Interaction Using EEG: Sensors, Application and
Future Directions. In Artificial Intelligence: Theories
and Applications, Vol. 7297, p. 331-338. Springer-
Verlag Berlin Heidelberg.
Emotiv Epoc specifications. Brain-computer interface
technology. Available at: http://www.emotiv.com/
upload/manual/sdk/EPOCSpecifications.pdf.
Graimann, B., Allison, B., Pfurtscheller, G., (editors),
2011. Brain-computer interfaces. Revolutionizing
human-computer interaction. The Frontiers
Collection. Springer-Verlag Berlin Heidelberg, 2011.
Kaminskas, V., 2007. Predictor-based self tuning control
with constraints. In: Model and Algorithms for Global
Optimization, Optimization and Its Applications Vol.
4, Springer, p. 333-341.
0 20 40 60 80
0
0.5
1
Time, s
Output
0 20 40 60 80
-2
0
2
Control input
Time, s
0 20 40 60 80
-2
0
2
Testing input
Time, s
0 20 40 60 80
0
0.5
1
Time, s
Output
0 20 40 60 80
-2
0
2
Control input
Time, s
0 20 40 60 80
-2
0
2
Testing Input
Time, s
ICINCO2015-12thInternationalConferenceonInformaticsinControl,AutomationandRobotics
586
Kaminskas, V., Vaškevičius, E., Vidugirienė, A., 2014.
Modeling Human Emotions as Reactions to a
Dynamical Virtual 3D Face. Vilnius University,
INFORMATICA, 2014, Vol. 25, No. 3, p. 425–437.
Scherer, K.R., Bänziger, T., Roesch, E.B., (editors), 2010.
Blueprint for Affective Computing, a sourcebook.
Series in Affective Science. Oxford university press,
2010.
Sourina, O., Liu, Y., 2011. A Fractal-based Algorithm of
Emotion Recognition from EEG using Arousal-
valence model. In Proc. Biosignals, p. 209-214.
Tan, D.S., Nijholt, A., (editors), 2010. Brain-computer
interfaces. Applying our minds to human-computer
interaction, Human-computer interaction series.
Springer-Verlag Berlin Heidelberg, 2010.
Vaškevičius, E., Vidugirienė, A., Kaminskas, V., 2014.
Identification of Human Response to Virtual 3D Face
Stimuli. Information Technologies and Control, Vol.
43, No. 1. p. 47 – 56.
Willis, J., and Todorov, A., 2006. First Impressions: Making Up Your Mind After a 100-Ms Exposure to a Face. Psychological Science, Vol. 17, No. 7, p. 592-598.
Predictor-basedControlofHumanEmotionsWhenReactingtoaDynamicVirtual3DFaceStimulus
587