for visual inspection training. Proceedings of the ACM
symposium on Virtual reality software and technology
- VRST ’01, page 1.
Essig, K., Pomplun, M., and Ritter, H. (2006). A neural net-
work for 3D gaze recording with binocular eye track-
ers. International Journal of Parallel, Emergent and
Distributed Systems, 21:79–95.
Fletcher, L., Loy, G., Barnes, N., and Zelinsky, A. (2005).
Correlating driver gaze with the road scene for driver
assistance systems. Robotics and Autonomous Sys-
tems, 52(1):71–84.
Fuhl, W., Kübler, T. C., Sippel, K., Rosenstiel, W., and
Kasneci, E. (2015a). Excuse: Robust pupil detec-
tion in real-world scenarios. In Azzopardi, G. and
Petkov, N., editors, Computer Analysis of Images and
Patterns - 16th International Conference, CAIP 2015,
Valletta, Malta, September 2-4, 2015 Proceedings,
Part I, volume 9256 of Lecture Notes in Computer Sci-
ence, pages 39–51. Springer.
Fuhl, W., Santini, T. C., Kübler, T., and Kasneci,
E. (2015b). ElSe: Ellipse Selection for Ro-
bust Pupil Detection in Real-World Environments.
arXiv:1511.06575.
Gidlöf, K., Wallin, A., Dewhurst, R., and Holmqvist, K.
(2013). Using eye tracking to trace a cognitive pro-
cess: Gaze behaviour during decision making in a nat-
ural environment. Journal of Eye Movement Research,
6(1):1–14.
Healy, A. F. and Proctor, R. W. (2003). Handbook of psy-
chology: Experimental psychology.
Hillaire, S., Lecuyer, A., Cozot, R., and Casiez, G. (2008).
Using an eye-tracking system to improve camera mo-
tions and depth-of-field blur effects in virtual environ-
ments. In Virtual Reality Conference, 2008. VR ’08.
IEEE, pages 47–50.
Howard, I. P. (2012). Depth from accommodation and ver-
gence. In Perceiving in Depth, Volume 3: Other Mech-
anisms of Depth Perception, pages 1–14. Oxford Uni-
versity Press (OUP).
Kandel, E. R., Schwartz, J. H., and Jessell, T. M. (2000).
Principles of neural science. McGraw-Hill, New
York.
Kasneci, E., Kasneci, G., Kübler, T. C., and Rosenstiel, W.
(2015). Online Recognition of Fixations, Saccades,
and Smooth Pursuits for Automated Analysis of Traf-
fic Hazard Perception. In Artificial Neural Networks,
volume 4 of Springer Series in Bio-/Neuroinformatics,
pages 411–434. Springer International Publishing.
Kasneci, E., Sippel, K., Heister, M., Aehling, K., Rosen-
stiel, W., Schiefer, U., and Papageorgiou, E. (2014).
Homonymous visual field loss and its impact on vi-
sual exploration: A supermarket study. Translational
Vision Science & Technology, 3(6).
Kasthurirangan, S. (2014). Current methods for objectively
measuring accommodation. Presented at the AAO
Workshop on Developing Novel Endpoints for Pre-
mium Intraocular Lenses.
Kourkoumelis, N. and Tzaphlidou, M. (2011). Eye safety
related to near infrared radiation exposure to biometric
devices. The Scientific World Journal, 11:520–528.
Lopes, P., Lavoie, R., Faldu, R., Aquino, N., Barron, J.,
and Kante, M. (2012). iCRAFT: Eye-controlled
robotic feeding arm technology.
Lukic, L., Santos-Victor, J., and Billard, A. (2014). Learn-
ing robotic eye-arm-hand coordination from hu-
man demonstration: A coupled dynamical systems ap-
proach. Biological Cybernetics, 108(2):223–248.
Mulvey, F., Villanueva, A., Sliney, D., Lange, R., Cotmore,
S., and Donegan, M. (2008). Exploration of safety
issues in eyetracking. Technical Report IST-2003-
511598, COGAIN EU Network of Excellence.
Reichelt, S., Haussler, R., Fütterer, G., and Leister, N.
(2010). Depth cues in human visual perception
and their realization in 3D displays. In Three-
Dimensional Imaging, Visualization, and Display
2010, pages 76900B-1–76900B-12.
Rogers, A. (1988). Mosby’s guide to physical examination.
Journal of anatomy, 157:235.
Schaeffel, F., Wilhelm, H., and Zrenner, E. (1993). Inter-
individual variability in the dynamics of natural ac-
commodation in humans: relation to age and refrac-
tive errors. The Journal of Physiology, 461(1):301–
320.
Shih, S.-W. and Liu, J. (2004). A novel approach to 3-D
gaze tracking using stereo cameras. IEEE Transactions
on Systems, Man, and Cybernetics, Part B, 34:234–
245.
Sunday, D. (2012). Distance between 3D lines & segments.
Świrski, L. and Dodgson, N. (2013). A fully-automatic,
temporal approach to single camera, glint-free 3D eye
model fitting. Proc. PETMEI.
Tafaj, E., Kasneci, G., Rosenstiel, W., and Bogdan, M.
(2012). Bayesian online clustering of eye movement
data. In Proceedings of the Symposium on Eye Track-
ing Research and Applications, ETRA ’12, pages
285–288. ACM.
Wang, R. I., Pelfrey, B., Duchowski, A. T., and House, D. H.
(2014). Online 3D Gaze Localization on Stereoscopic
Displays. ACM Transactions on Applied Perception,
11(1):1–21.
3D Gaze Estimation using Eye Vergence