Figure 3: ElSe (Fuhl et al., 2016) workflow. (1) Input image. (2) Canny edge image. (3) The edges remaining after analysis of curvature, enclosed intensity values, and shape. (4) The curve enclosing the most low-intensity values and yielding the most circular ellipse is chosen as the pupil boundary. (5) The resulting pupil center estimate. If the initial estimate fails to produce a valid ellipse, the input image (6) is downscaled (7). (8) The downscaled image after convolution with a mean filter and a surface difference filter. (9) A threshold for pupil region extraction is calculated and used for pupil area estimation. (10) The resulting pupil center estimate.
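The fallback branch of this workflow (steps 6–10) can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: the downscaling factor, the 3×3 mean filter, the 20% threshold heuristic, and the synthetic test image are all assumptions, and the surface difference filter is omitted for brevity.

```python
# Sketch of an ElSe-style fallback path: downscale, mean-filter, threshold
# the darkest region, and take its centroid as the pupil center estimate.
# All parameters below are illustrative assumptions, not the original ones.
import numpy as np

def downscale(img, factor=4):
    """Downscale by averaging non-overlapping factor x factor blocks."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    return img[:h, :w].reshape(h // factor, factor,
                               w // factor, factor).mean(axis=(1, 3))

def mean_filter(img, k=3):
    """Naive k x k mean filter; border pixels are left unchanged."""
    out = img.copy()
    r = k // 2
    for y in range(r, img.shape[0] - r):
        for x in range(r, img.shape[1] - r):
            out[y, x] = img[y - r:y + r + 1, x - r:x + r + 1].mean()
    return out

def pupil_center(img, factor=4):
    """Threshold the darkest region of the filtered, downscaled image
    and return its centroid (x, y) in downscaled coordinates."""
    small = mean_filter(downscale(img, factor))
    # Assumed heuristic: threshold a fixed fraction above the darkest value.
    thresh = small.min() + 0.2 * (small.max() - small.min())
    ys, xs = np.nonzero(small <= thresh)
    return xs.mean(), ys.mean()

# Synthetic eye image: bright background (200) with a dark pupil disk (20)
# centered at (39.5, 39.5) with radius 16.
img = np.full((80, 80), 200.0)
yy, xx = np.ogrid[:80, :80]
img[(yy - 39.5) ** 2 + (xx - 39.5) ** 2 <= 16 ** 2] = 20.0
cx, cy = pupil_center(img)
print(cx, cy)  # centroid in downscaled (20 x 20) coordinates
```

On this symmetric test image the centroid lands at the center of the downscaled grid; a real detector would map it back to full-resolution coordinates and refine it, as the validity checks in steps (3)–(4) do for the primary branch.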
Kanade, 2012; Świrski and Dodgson, 2013), support
for remote gaze estimation (Model and Eizenman,
2010; Yamazoe et al., 2008), additional calibration
methods (Guestrin and Eizenman, 2006; Pirri et al.,
2011), real-time eye movement classification based
on Bayesian mixture models (Kasneci et al., 2014;
Kasneci et al., 2015; Santini et al., 2016), automatic
blink detection, support for other eye trackers and for binocular systems, as well as external, user-definable trigger events. Feedback
from the community on future features is also wel-
come.
Source code, binaries for Windows, and extensive
documentation are available at:
www.perception.uni-tuebingen.de
REFERENCES
Bradski, G. et al. (2000). The OpenCV Library. Dr. Dobb's Journal, 25(11):120–126.
Dalmaijer, E. S., Mathôt, S., and Van der Stigchel, S. (2014). PyGaze: An open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments. Behavior Research Methods, 46(4).
Duchowski, A. T. (2002). A breadth-first survey of eye-
tracking applications. Behavior Research Methods,
Instruments, & Computers, 34(4):455–470.
Ergoneers (2014). D-Lab Manual.
Ergoneers (2015). Dikablis Glasses. http://www.ergoneers.com/en/hardware/eye-tracking/.
Fuhl, W., Kübler, T., Sippel, K., Rosenstiel, W., and Kasneci, E. (2015). ExCuSe: Robust pupil detection in real-world scenarios. In Computer Analysis of Images and Patterns 2015. CAIP 2015. 16th International Conference. IEEE.
Fuhl, W., Santini, T., Kübler, T., and Kasneci, E. (2016). ElSe: Ellipse selection for robust pupil detection in real-world environments. In Proceedings of the Symposium on Eye Tracking Research and Applications. ACM. Forthcoming.
Guestrin, E. D. and Eizenman, M. (2006). General theory
of remote gaze estimation using the pupil center and
corneal reflections. Biomedical Engineering, IEEE
Transactions on, 53(6):1124–1133.
Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R.,
Jarodzka, H., and Van de Weijer, J. (2011). Eye track-
ing: A comprehensive guide to methods and measures.
Oxford University Press.
Holsanova, J., Rahm, H., and Holmqvist, K. (2006). Entry
points and reading paths on newspaper spreads: com-
paring a semiotic analysis with eye-tracking measure-
ments. Visual communication, 5(1):65–93.
Javadi, A.-H., Hakimi, Z., Barati, M., Walsh, V., and
Tcheang, L. (2015). Set: a pupil detection method
using sinusoidal approximation. Frontiers in neuro-
engineering, 8.
Kasneci, E., Kasneci, G., Kübler, T. C., and Rosenstiel, W.
(2014). The applicability of probabilistic methods to
the online recognition of fixations and saccades in dy-
namic scenes. In Proceedings of the Symposium on
Eye Tracking Research and Applications, ETRA ’14,
pages 323–326, New York, NY, USA. ACM.
Kasneci, E., Kasneci, G., Kübler, T. C., and Rosenstiel, W.
(2015). Online recognition of fixations, saccades, and
smooth pursuits for automated analysis of traffic haz-
ard perception. In Koprinkova-Hristova, P., Mlade-
nov, V., and Kasabov, N. K., editors, Artificial Neu-
ral Networks, volume 4 of Springer Series in Bio-
/Neuroinformatics, pages 411–434. Springer Interna-
tional Publishing.
Kassner, M., Patera, W., and Bulling, A. (2014). Pupil: An
VISAPP 2016 - International Conference on Computer Vision Theory and Applications