Fuhl, W. (2021a). Maximum and leaky maximum propagation. arXiv preprint arXiv:2105.10277.
Fuhl, W. (2021b). Tensor normalization and full distribution training. arXiv preprint arXiv:2109.02345.
Fuhl, W., Castner, N., and Kasneci, E. (2018a). Histogram of oriented velocities for eye movement detection. In International Conference on Multimodal Interaction Workshops, ICMIW.
Fuhl, W., Castner, N., and Kasneci, E. (2018b). Rule based learning for eye movement type detection. In International Conference on Multimodal Interaction Workshops, ICMIW.
Fuhl, W., Gao, H., and Kasneci, E. (2020a). Neural networks for optical vector and eye ball parameter estimation. In ACM Symposium on Eye Tracking Research & Applications, ETRA 2020. ACM.
Fuhl, W. and Kasneci, E. (2019). Learning to validate the quality of detected landmarks. In International Conference on Machine Vision, ICMV.
Fuhl, W., Kasneci, G., and Kasneci, E. (2021). TEyeD: Over 20 million real-world eye images with pupil, eyelid, and iris 2D and 3D segmentations, 2D and 3D landmarks, 3D eyeball, gaze vector, and eye movement types. In 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pages 367–375. IEEE.
Fuhl, W., Kübler, T. C., Sippel, K., Rosenstiel, W., and Kasneci, E. (2015). ExCuSe: Robust pupil detection in real-world scenarios. In 16th International Conference on Computer Analysis of Images and Patterns (CAIP 2015).
Fuhl, W., Rong, Y., and Kasneci, E. (2020b). Fully convolutional neural networks for raw eye tracking data segmentation, generation, and reconstruction. In Proceedings of the International Conference on Pattern Recognition.
Fuhl, W., Santini, T., Geisler, D., Kübler, T. C., Rosenstiel, W., and Kasneci, E. (2016a). Eyes wide open? Eyelid location and eye aperture estimation for pervasive eye tracking in real-world scenarios. In ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, PETMEI 2016.
Fuhl, W., Santini, T., and Kasneci, E. (2017). Fast and robust eyelid outline and aperture detection in real-world scenarios. In IEEE Winter Conference on Applications of Computer Vision (WACV 2017).
Fuhl, W., Santini, T., Kübler, T. C., and Kasneci, E. (2016b). ElSe: Ellipse selection for robust pupil detection in real-world environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), pages 123–130.
Gardony, A. L., Lindeman, R. W., and Brunyé, T. T. (2020). Eye-tracking for human-centered mixed reality: Promises and challenges. In Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), volume 11310, page 113100T. International Society for Optics and Photonics.
Hale, M. L. (2019). Eyestream: An open WebSocket-based middleware for serializing and streaming eye tracker event data from Gazepoint GP3 HD research hardware. Journal of Open Source Software, 4(43):1620.
Hasselbring, W., Carr, L., Hettrick, S., Packer, H., and Tiropanis, T. (2020). Open source research software. Computer, 53(8):84–88.
He, K., Zhang, X., Ren, S., and Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778.
iMotions (2021). iMotions. https://imotions.com/. [Online; accessed 04-November-2021].
Jermann, P., Nüssli, M.-A., and Li, W. (2010). Using dual eye-tracking to unveil coordination and expertise in collaborative Tetris. In Proceedings of HCI 2010 24, pages 36–44.
Jiang, Z., Chang, Y., and Liu, X. (2020). Design of software-defined gateway for industrial interconnection. Journal of Industrial Information Integration, 18:100130.
Jones, P. R. (2018). Myex: A MATLAB interface for the Tobii EyeX eye-tracker. Journal of Open Research Software, 6(1).
Joseph, A. W. and Murugesh, R. (2020). Potential eye tracking metrics and indicators to measure cognitive load in human-computer interaction research. Journal of Scientific Research, 64(1).
Lepekhin, A., Capo, D., Levina, A., Borremans, A., and Khasheva, Z. (2020). Adoption of Industrie 4.0 technologies in the manufacturing companies in Russia. In Proceedings of the International Scientific Conference - Digital Transformation on Manufacturing, Infrastructure and Service, pages 1–6.
Lev, A., Braw, Y., Elbaum, T., Wagner, M., and Rassovsky, Y. (2020). Eye tracking during a continuous performance test: Utility for assessing ADHD patients. Journal of Attention Disorders, page 1087054720972786.
Li, D., Babcock, J., and Parkhurst, D. J. (2006). openEyes: A low-cost head-mounted eye-tracking solution. In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, pages 95–100.
Liu, C., Chen, Y., Tai, L., Ye, H., Liu, M., and Shi, B. E. (2019). A gaze model improves autonomous driving. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, pages 1–5.
Loshchilov, I. and Hutter, F. (2018). Fixing weight decay regularization in Adam. arXiv preprint arXiv:1711.05101.
Mao, R., Li, G., Hildre, H. P., and Zhang, H. (2021). A survey of eye tracking in automobile and aviation studies: Implications for eye-tracking studies in marine operations. IEEE Transactions on Human-Machine Systems, 51(2):87–98.
Meng, X., Du, R., and Varshney, A. (2020). Eye-dominance-guided foveated rendering. IEEE Transactions on Visualization and Computer Graphics, 26(5):1972–1980.
Nesaratnam, N., Thomas, P., and Vivian, A. (2017). Stepping into the virtual unknown: Feasibility study of a virtual reality-based test of ocular misalignment. Eye, 31(10):1503–1506.