static or unchanged. This problem may also occur when the user interface is displayed far from the user's field of view (FOV). We tested our approach with 7 participants; although the initial feedback is positive and encouraging, we plan to conduct an experiment in real-world scenarios and report the results in follow-up studies.
As we improve the system, we are implementing algorithms and methods to detect the digits faster. In addition, the algorithm was tested only with users with full mobility; additional evaluation will help us understand the effectiveness of the approach in real-world scenarios with participants whose motor skills are restricted.
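To make the selection principle concrete, the sketch below illustrates the correlation-based smooth-pursuit detection commonly used in this family of interfaces: gaze samples are compared against each moving target's trajectory over a time window, and a selection fires only when the two trajectories correlate strongly. This is a minimal Python illustration under assumed values; the function names, window handling, and the 0.8 threshold are hypothetical and do not describe our exact implementation.

```python
import numpy as np

def pursuit_correlation(gaze: np.ndarray, target: np.ndarray) -> float:
    """Mean Pearson correlation of the x and y components of two
    (N, 2) trajectories sampled over the same time window."""
    rx = np.corrcoef(gaze[:, 0], target[:, 0])[0, 1]
    ry = np.corrcoef(gaze[:, 1], target[:, 1])[0, 1]
    return (rx + ry) / 2.0

def select_target(gaze: np.ndarray, targets: dict, threshold: float = 0.8):
    """Return the id of the moving target the gaze follows, or None.

    A static gaze (fixation) does not correlate with any moving
    target, so no selection is triggered unless the user deliberately
    pursues one; this is what avoids the Midas touch problem.
    """
    best_id, best_r = None, threshold
    for target_id, trajectory in targets.items():
        r = pursuit_correlation(gaze, trajectory)
        if r > best_r:
            best_id, best_r = target_id, r
    return best_id
```

In practice, such a detector runs over a sliding window of recent gaze samples, and the window length and threshold trade off selection speed against false activations.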
6 CONCLUSION
In this paper, we presented a novel eye-based entry technique that uses smooth pursuit eye movements. A key point of this paradigm is that a selection implies that the user has followed a moving target, thus eliminating the Midas touch problem. Other applications may benefit from this interaction technique, for example entering a flight level in a virtual Air Traffic Control simulator. Future work will explore digit recognition time and investigate the potential of this method for alphanumeric characters.