Cowley, B., Ravaja, N. & Heikura, T. (2013). Cardiovascular
physiology predicts learning effects in a serious game
activity. In: Computers & Education, 60(1), 299–309.
Devillers, L. & Vidrascu, L. (2007). Real-life emotion recognition in speech. In: Speaker Classification II. pp. 34–42. Springer, Berlin, Heidelberg.
D’Mello, S. K., Craig, S. D., Sullins, J., & Graesser, A. C. (2006). Predicting affective states expressed through an emote-aloud procedure from AutoTutor’s mixed-initiative dialogue. In: International Journal of Artificial Intelligence in Education, 16(1), 3–28.
D’Mello, S. K., Craig, S. D., Witherspoon, A., McDaniel, B., & Graesser, A. (2008). Automatic detection of learner’s affect from conversational cues. In: User Modeling and User-Adapted Interaction, 18(1-2), 45–80.
D’Mello, S. & Graesser, A. (2012). Dynamics of affective
states during complex learning. In: Learning and
Instruction, 22(2), 145–157.
D’Mello, S. (2013). A selective meta-analysis on the relative
incidence of discrete affective states during learning with
technology. In: Journal of Educational Psychology,
105(4), 1082.
D’Mello, S. & Mills, C. (2014). Emotions while writing
about emotional and non-emotional topics. In:
Motivation and Emotion, 38(1), 140–156.
EyeBlinkRight, Apple developer documentation, https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation/2928262-eyeblinkright.
Feidakis, M., Daradoumis, T., & Caballé, S. (2013, January). Building emotion-aware features in computer supported collaborative learning (CSCL) systems. In: Alpine Rendez-Vous (ARV) Workshop on Tools and Technologies for Emotion Awareness in Computer-Mediated Collaboration and Learning (ARV 2013).
Forbes-Riley, K. & Litman, D. (2011, June). When does disengagement correlate with learning in spoken dialog computer tutoring? In: International Conference on Artificial Intelligence in Education. pp. 81–89. Springer, Berlin, Heidelberg.
Gomes, J., Yassine, M., Worsley, M., & Blikstein, P. (2013,
July). Analysing engineering expertise of high school
students using eye tracking and multimodal learning
analytics. In: Educational Data Mining.
Grafsgaard, J. F., Wiggins, J. B., Boyer, K. E., Wiebe, E. N.,
& Lester, J. C. (2013, July). Embodied affect in tutorial
dialogue: student gesture and posture. In: International
Conference on Artificial Intelligence in Education. pp. 1–
10. Springer, Berlin, Heidelberg.
Grafsgaard, J., Wiggins, J. B., Boyer, K. E., Wiebe, E. N., &
Lester, J. (2013, July). Automatically recognizing facial
expression: Predicting engagement and frustration. In:
Educational Data Mining.
Hussain, M. S., AlZoubi, O., Calvo, R. A., & D’Mello, S. K.
(2011, June). Affect detection from multichannel
physiology during learning sessions with AutoTutor. In:
International Conference on Artificial Intelligence in
Education. pp. 131–138. Springer, Berlin, Heidelberg.
Kapoor, A. & Picard, R. W. (2005). Multimodal affect
recognition in learning environments, In: Proceedings of
the 13th Annual ACM International Conference on
Multimedia, pp. 677–682.
Koning, B. B. de, Tabbers, H. K., Rikers, R. M., & Paas, F. (2010). Attention guidance in learning from a complex animation: Seeing is understanding? In: Learning and Instruction, 20(2), 111–122.
Landis, J. R. & Koch, G. G. (1977). The measurement of observer agreement for categorical data. In: Biometrics, 33(1), 159–174.
Li, Y., Su, H., Shen, X., Li, W., Cao, Z., & Niu, S. (2017). DailyDialog: A manually labelled multi-turn dialogue dataset. In: arXiv preprint arXiv:1710.03957.
Luft, C. D., Nolte, G., & Bhattacharya, J. (2013). High-
learners present larger mid-frontal theta power and
connectivity in response to incorrect performance
feedback. In: The Journal of Neuroscience, 33(5), 2029–
2038.
O'Brien, H. L. & Toms, E. G. (2010). The development and
evaluation of a survey to measure user engagement. In:
Journal of the American Society for Information Science
and Technology, 61(1), 50–69.
Pardos, Z. A., Baker, R. S., San Pedro, M. O., Gowda, S. M.,
& Gowda, S. M. (2014). Affective states and state tests:
Investigating how affect and engagement during the
school year predict end-of-year learning outcomes. In:
Journal of Learning Analytics, 1(1), 107–128.
Parsons, J. & Taylor, L. (2012). Student engagement: What do we know and what should we do? University of Alberta.
Peng, S., Ohira, S., Nagao, K. (2019). Prediction of Students’ Answer Relevance in Discussion Based on their Heart Rate Data. In: International Journal of Innovation and Research in Educational Sciences (IJIRES), 6(3), 414–424.
Peng, S., Ohira, S., Nagao, K. (2019). Automatic Evaluation of Students’ Discussion Skills Based on their Heart Rate. In: Computer Supported Education, 1022, 572–585, Springer.
Poria, S., Majumder, N., Mihalcea, R., & Hovy, E. (2019). Emotion recognition in conversation: Research challenges, datasets, and recent advances. In: IEEE Access, 7, 100943–100953.
Rodrigo, M. M. T., Baker, R. S., Agapito, J., Nabos, J., Repalam, M. C., Reyes, S. S., & San Pedro, M. O. C. (2012). The effects of an interactive software agent on student affective dynamics while using an intelligent tutoring system. In: IEEE Transactions on Affective Computing, 3(2), 224–236.
Shepard, L. A. (2005). Linking formative assessment to
scaffolding. In: Educational Leadership, 63(3), 66–70.
Stevens, R. H., Galloway, T., & Berka, C. (2007, July). EEG-
related changes in cognitive workload, engagement and
distraction as students acquire problem solving skills. In:
International Conference on User Modeling. pp. 187–
196. Springer, Berlin, Heidelberg.
Whitehill, J., Serpell, Z., Foster, A., Lin, Y. C., Pearson, B.,
Bartlett, M., & Movellan, J. (2011, June). Towards an
optimal affect-sensitive instructional system of cognitive
skills. In: CVPR 2011 Workshops. pp. 20–25. IEEE.