
system. They can replicate the proposed implemen-
tation using their own laptops’ webcams and evaluate
the accuracy of the resulting systems. Varying the
distance between the camera and the user’s eyes is
another important factor to test during such experi-
ments. Furthermore, implementing a new, customized
architecture based on the methodology provided in
this paper could be a promising direction in this field.
An initial step toward improving the results presented
in this work would be to tune the hyper-parameters
and conduct more in-depth tests on the proposed ar-
chitecture.
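As a concrete starting point for such hyper-parameter tests, a simple grid search can enumerate candidate configurations before retraining. The sketch below is illustrative only; the parameter names and ranges are assumptions for demonstration, not values taken from the proposed architecture:

```python
from itertools import product

# Hypothetical hyper-parameter grid; names and ranges are
# illustrative assumptions, not values from this work.
grid = {
    "learning_rate": [1e-4, 3e-4, 1e-3],
    "batch_size": [16, 32],
    "dropout": [0.2, 0.5],
}

def configurations(grid):
    """Yield every hyper-parameter combination as a dict."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(configurations(grid))
# 3 * 2 * 2 = 12 candidate configurations to train and compare
print(len(configs))
```

Each configuration would then be used to retrain the model and compare gaze-estimation accuracy, keeping the best-performing combination.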
Finally, we find gaze-tracking algorithm applica-
tions useful, and we will certainly continue develop-
ing some of those described in Section 2.2 as part of
our future work. We also invite readers to explore a
wide range of applications, such as an aim assistant
for shooter games, a mouse-controller assistant for
people with motor disabilities, gaze-based assistance
for driving a vehicle, or neuromarketing studies based
on visual-attention analysis for product or ad place-
ment on websites such as e-commerce or social media
pages.
REFERENCES
Anantha Prabha, P., Srinivash, K., Vigneshwar, S., and
Viswa, E. (2022). Mouse assistance for motor-
disabled people using computer vision. In Proceed-
ings of International Conference on Recent Trends in
Computing, pages 403–413. Springer.
Cazzato, D., Leo, M., Distante, C., and Voos, H. (2020).
When I look into your eyes: A survey on computer
vision contributions for human gaze estimation and
tracking. Sensors, 20(13):3739.
Chen, H.-H., Hwang, B.-J., Wu, J.-S., and Liu, P.-T. (2020).
The effect of different deep network architectures
upon cnn-based gaze tracking. Algorithms, 13(5):127.
de Lope, J. and Graña, M. (2022). Deep transfer learning-
based gaze tracking for behavioral activity recogni-
tion. Neurocomputing, 500:518–527.
Dilini, N., Senaratne, A., Yasarathna, T., Warnajith, N.,
and Seneviratne, L. (2021). Cheating detection in
browser-based online exams through eye gaze track-
ing. In 2021 6th International Conference on Informa-
tion Technology Research (ICITR), pages 1–8. IEEE.
Gudi, A., Li, X., and van Gemert, J. (2020). Efficiency in
real-time webcam gaze tracking. In Computer Vision–
ECCV 2020 Workshops: Glasgow, UK, August 23–
28, 2020, Proceedings, Part I 16, pages 529–543.
Springer.
Holmqvist, K. and Andersson, R. (2017). Eye tracking:
A comprehensive guide to methods, paradigms and
measures.
Huang, J., Zhang, Z., Xie, G., and He, H. (2021). Real-time
precise human-computer interaction system based on
gaze estimation and tracking. Wireless Communica-
tions and Mobile Computing, 2021.
Kaur, H., Jindal, S., and Manduchi, R. (2022). Rethink-
ing model-based gaze estimation. Proceedings of
the ACM on computer graphics and interactive tech-
niques, 5(2).
Krafka, K., Khosla, A., Kellnhofer, P., Kannan, H., Bhan-
darkar, S., Matusik, W., and Torralba, A. (2016). Eye
tracking for everyone. In Proceedings of the IEEE
conference on computer vision and pattern recogni-
tion, pages 2176–2184.
Liu, J., Chi, J., Hu, W., and Wang, Z. (2020). 3d model-
based gaze tracking via iris features with a single cam-
era and a single light source. IEEE Transactions on
Human-Machine Systems, 51(2):75–86.
Liu, J., Chi, J., Yang, H., and Yin, X. (2022). In the eye
of the beholder: A survey of gaze tracking techniques.
Pattern Recognition, page 108944.
Lu, F., Sugano, Y., Okabe, T., and Sato, Y. (2014). Adap-
tive linear regression for appearance-based gaze esti-
mation. IEEE transactions on pattern analysis and
machine intelligence, 36(10):2033–2046.
Mahanama, B., Jayawardana, Y., and Jayarathna, S. (2020).
Gaze-net: Appearance-based gaze estimation using
capsule networks. In Proceedings of the 11th aug-
mented human international conference, pages 1–4.
Martinez, F., Carbone, A., and Pissaloux, E. (2012). Gaze
estimation using local features and non-linear regres-
sion. In 2012 19th IEEE International Conference on
Image Processing, pages 1961–1964. IEEE.
Modi, N. and Singh, J. (2021). A review of various state of
art eye gaze estimation techniques. Advances in Com-
putational Intelligence and Communication Technol-
ogy: Proceedings of CICT 2019, pages 501–510.
Ou, W.-L., Kuo, T.-L., Chang, C.-C., and Fan, C.-P. (2021).
Deep-learning-based pupil center detection and track-
ing technology for visible-light wearable gaze track-
ing devices. Applied Sciences, 11(2):851.
Sabab, S. A., Kabir, M. R., Hussain, S. R., Mahmud, H.,
Hasan, M., Rubaiyeat, H. A., et al. (2022). Vis-itrack:
Visual intention through gaze tracking using low-cost
webcam. arXiv preprint arXiv:2202.02587.
Sharma, K., Giannakos, M., and Dillenbourg, P. (2020).
Eye-tracking and artificial intelligence to enhance mo-
tivation and learning. Smart Learning Environments,
7(1):1–19.
Sharma, P., Joshi, S., Gautam, S., Maharjan, S., Khanal,
S. R., Reis, M. C., Barroso, J., and de Jesus Filipe,
V. M. (2023). Student engagement detection using
emotion analysis, eye tracking and head movement
with machine learning. In Technology and Innovation
in Learning, Teaching and Education: Third Interna-
tional Conference, TECH-EDU 2022, Lisbon, Portu-
gal, August 31–September 2, 2022, Revised Selected
Papers, pages 52–68. Springer.
Werchan, D. M., Thomason, M. E., and Brito, N. H. (2022).
Owlet: An automated, open-source method for infant
gaze tracking using smartphone and webcam record-
ings. Behavior Research Methods, pages 1–15.
Zhang, X., Sugano, Y., Fritz, M., and Bulling, A. (2017).
It’s written all over your face: Full-face appearance-
based gaze estimation. In Proceedings of the IEEE
conference on computer vision and pattern recogni-
tion workshops, pages 51–60.
A Webcam Artificial Intelligence-Based Gaze-Tracking Algorithm