
only eye tracking glasses. Journal of Computational
Design and Engineering, 7(2):228–237.
Li, X., Çöltekin, A., and Kraak, M.-J. (2010). Visual Exploration
of Eye Movement Data Using the Space-Time-Cube. In Fabrikant, S. I.,
Reichenbacher, T., van Kreveld, M., and Schlieder, C., editors,
Geographic Information Science, pages 295–309. Springer.
Liu, M., Li, Y., and Liu, H. (2020). 3D Gaze Estimation
for Head-Mounted Eye Tracking System With Auto-
Calibration Method. IEEE Access, 8:104207–104215.
Llanes-Jurado, J., Marín-Morales, J., Guixeres, J., and Alcañiz, M.
(2020). Development and Calibration of an Eye-Tracking Fixation
Identification Algorithm for Immersive Virtual Reality. Sensors,
20(17):4956.
Mizuchi, Y. and Inamura, T. (2018). Evaluation of Human
Behavior Difference with Restricted Field of View in
Real and VR Environments. In IEEE Int. Symp. on
Robot and Human Interactive Communication (RO-
MAN), pages 196–201.
Onkhar, V., Dodou, D., and de Winter, J. C. F. (2023). Eval-
uating the Tobii Pro Glasses 2 and 3 in static and dy-
namic conditions. Behavior Research Methods.
Paletta, L., Santner, K., Fritz, G., Mayer, H., and Schram-
mel, J. (2013). 3D attention: measurement of visual
saliency using eye tracking glasses. In ACM CHI EA
’13 Extended Abstracts on Human Factors in Comput-
ing Systems, pages 199–204.
Pathmanathan, N., Öney, S., Becher, M., Sedlmair, M., Weiskopf, D.,
and Kurzhals, K. (2023). Been There, Seen That: Visualization of
Movement and 3D Eye Tracking Data from Real-World Environments.
Computer Graphics Forum, 42(3):385–396.
Pfeiffer, T. (2012). Measuring and visualizing attention in
space with 3D attention volumes. In Proc. of the ACM
Symp. on ETRA, pages 29–36.
Pfeiffer, T. and Memili, C. (2016). Model-based real-time
visualization of realistic three-dimensional heat maps
for mobile eye tracking and eye tracking in virtual re-
ality. In Proc. of the ACM Symp. on ETRA, pages 95–
102.
Pfeiffer, T., Renner, P., and Pfeiffer-Leßmann, N. (2016).
EyeSee3D 2.0: model-based real-time analysis of
mobile eye-tracking in static and dynamic three-
dimensional scenes. In Proc. of the ACM Symp. on
ETRA, pages 189–196.
Pfeil, K., Taranta, E. M., Kulshreshth, A., Wisniewski, P.,
and LaViola, J. J. (2018). A comparison of eye-head
coordination between virtual and physical realities. In
Proc. ACM Symp. on Applied Perception, pages 1–7.
Pomplun, M. and Sunkara, S. (2019). Pupil dilation as an
indicator of cognitive workload in human-computer
interaction. In Human-Centered Computing, pages
542–546. CRC Press.
Reipschläger, P., Brudy, F., Dachselt, R., Matejka, J., Fitzmaurice,
G., and Anderson, F. (2022). AvatAR: An Immersive Analysis
Environment for Human Motion Data Combining Interactive 3D Avatars
and Trajectories. In Proc. of ACM CHI, pages 1–15.
Salvucci, D. and Goldberg, J. (2000). Identifying fixations
and saccades in eye-tracking protocols. In Proc. of
ACM Symp. ETRA, pages 71–78.
Startsev, M., Agtzidis, I., and Dorr, M. (2019). 1D CNN
with BLSTM for automated classification of fixations,
saccades, and smooth pursuits. Behavior Research
Methods, 51(2):556–572.
Stellmach, S., Nacke, L., and Dachselt, R. (2010). Ad-
vanced gaze visualizations for three-dimensional vir-
tual environments. In Proc. of ACM Symp. ETRA,
pages 109–112.
Sundstedt, V. and Garro, V. (2022). A Systematic Review of
Visualization Techniques and Analysis Tools for Eye-
Tracking in 3D Environments. Frontiers in Neuroer-
gonomics, 3.
Takahashi, N., Inamura, T., Mizuchi, Y., and Choi, Y.
(2021). Evaluation of the difference of human behav-
ior between VR and real environments in searching and
manipulating objects in a domestic environment. In
Proc. IEEE Int. Conf. on Robot & Human Interactive
Communication (RO-MAN), pages 454–460.
Takahashi, R., Suzuki, H., Chew, J. Y., Ohtake, Y., Na-
gai, Y., and Ohtomi, K. (2018). A system for three-
dimensional gaze fixation analysis using eye tracking
glasses. Journal of Computational Design and Engi-
neering, 5(4):449–457.
Ugwitz, P., Kvarda, O., Juříková, Z., Šašinka, Č., and Tamm, S.
(2022). Eye-Tracking in Interactive Virtual Environments:
Implementation and Evaluation. Applied Sciences, 12(3):1027.
Villenave, S., Cabezas, J., Baert, P., Dupont, F., and Lavoué, G.
(2022). XREcho: a Unity plug-in to record and visualize user
behavior during XR sessions. In Proc. of ACM Multimedia Systems
Conf., pages 341–346.
Wang, X., Lindlbauer, D., Lessig, C., and Alexa, M. (2017).
Accuracy of Monocular Gaze Tracking on 3D Geom-
etry. In Eye Tracking and Visualization, pages 169–
184. Springer: Mathematics and Visualization.
Wang, X., Lindlbauer, D., Lessig, C., Maertens, M., and
Alexa, M. (2016). Measuring the Visual Salience of
3D Printed Objects. IEEE Computer Graphics and
Applications, 36(4):46–55.
Willemsen, P., Colton, M. B., Creem-Regehr, S. H., and
Thompson, W. B. (2009). The effects of head-
mounted display mechanical properties and field of
view on distance judgments in virtual environments.
ACM Transactions on Applied Perception, 6(2):8:1–
8:14.
Zemblys, R., Niehorster, D. C., Komogortsev, O., and
Holmqvist, K. (2018). Using machine learning to de-
tect events in eye-tracking data. Behavior Research
Methods, 50(1):160–181.
Zhang, L., Wade, J., Swanson, A., Weitlauf, A., Warren, Z.,
and Sarkar, N. (2015). Cognitive state measurement
from eye gaze analysis in an intelligent virtual real-
ity driving system for autism intervention. In Proc. Int. Conf.
on Affective Computing and Intelligent Interaction (ACII), pages
532–538. IEEE.