Figure 4: Error rate before and after feature selection.
Table 4: Best feature selection algorithms and features selected.

Algorithm   1  2  3  4  5  6  7  8  9
BFS         1  2  3  4  5  6  7
B&B         2  4  7  5  1  6  3
5 CONCLUSIONS
This work has drawn on techniques developed in the field of behavioural biometrics, in particular HCI-based biometric modalities, and combined them with techniques from gaze tracking, pupillometry and facial feature extraction to create a new biometric modality based on gaze. The preliminary results obtained suggest that gaze information may have some potential as a biometric modality. However, the experiments were carried out on only a very small sample; more testing is required to confirm the preliminary findings of this project.
A gaze-based biometric modality would be an affordable and non-intrusive way of verifying a user's identity. It would also open the way to a number of new application areas. It is well suited to verifying users who interact with camera-equipped devices such as smartphones and personal computers, and gaze-based biometric systems could serve as a remote authentication mechanism for websites or e-commerce sites. Another potential application is liveness detection, in which liveness is inferred from the movement of the pupil in response to a stimulus such as a variation in lighting conditions or the presentation of images.
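As a minimal sketch of the pupil-based liveness idea described above: a live eye constricts when a bright stimulus is presented, whereas a printed photograph or replayed static image shows no such change. The function name, sampling scheme and threshold below are illustrative assumptions, not part of the system described in this paper.

```python
def is_live(baseline_diameters, stimulus_diameters, min_constriction=0.10):
    """Decide liveness from pupil diameters sampled before and after a light stimulus.

    baseline_diameters: pupil diameters (e.g. in pixels) sampled before the stimulus.
    stimulus_diameters: pupil diameters sampled after the stimulus onset.
    min_constriction: minimum fractional shrinkage expected from a live pupil
                      (0.10 is an assumed placeholder, to be calibrated).
    """
    baseline = sum(baseline_diameters) / len(baseline_diameters)
    response = sum(stimulus_diameters) / len(stimulus_diameters)
    # A live pupil constricts under a bright stimulus, so the relative
    # reduction in mean diameter should exceed the threshold.
    constriction = (baseline - response) / baseline
    return constriction >= min_constriction
```

In practice the diameters would come from the pupillometry stage of the gaze tracker, and the threshold would be set empirically to balance false accepts (replayed imagery) against false rejects (weak pupillary response).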
Future research on this topic should be directed at increasing the overall accuracy of the gaze tracking system, as well as investigating the possibility of developing a multimodal biometric system that combines gaze with other existing biometric modalities, such as iris or fingerprint recognition, or with HCI-based modalities such as keystroke or mouse dynamics.
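One common way to realise such a multimodal system is score-level fusion, where each modality produces a normalised match score and the scores are combined before the accept/reject decision. The weights and threshold below are illustrative assumptions only; the paper does not specify a fusion scheme.

```python
def fuse_scores(gaze_score, keystroke_score, w_gaze=0.6, w_key=0.4):
    """Weighted-sum fusion of two match scores, each assumed normalised to [0, 1]."""
    return w_gaze * gaze_score + w_key * keystroke_score

def accept(gaze_score, keystroke_score, threshold=0.5):
    """Accept the claimed identity if the fused score clears the threshold."""
    return fuse_scores(gaze_score, keystroke_score) >= threshold
```

The weights would typically be tuned on validation data to reflect the relative reliability of each modality, with the threshold chosen to trade off false acceptance against false rejection.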
BIOSIGNALS 2011 - International Conference on Bio-inspired Systems and Signal Processing