Authors:
Fabian Timm and Erhardt Barth
Affiliation:
University of Lübeck and Innovations Campus Lübeck, Germany
Keyword(s):
Eye centre localisation, Pupil and iris localisation, Image gradients, Feature extraction, Shape analysis.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Biomedical Engineering; Biomedical Signal Processing; Computer Vision, Visualization and Computer Graphics; Data Manipulation; Feature Extraction; Health Engineering and Technology Applications; Human-Computer Interaction; Image and Video Analysis; Image Shape Analysis; Informatics in Control, Automation and Robotics; Methodologies and Methods; Motion, Tracking and Stereo Vision; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Real-Time Vision; Sensor Networks; Signal Processing, Sensors, Systems Modeling and Control; Soft Computing; Tracking of People and Surveillance
Abstract:
The estimation of the eye centres is used in several computer vision applications such as face recognition or eye tracking. Especially for the latter, systems that are remote and rely on available light have become very popular, and several methods for accurate eye centre localisation have been proposed. Nevertheless, these methods often fail to accurately estimate the eye centres in difficult scenarios, e.g. low resolution, low contrast, or occlusions. We therefore propose an approach for accurate and robust eye centre localisation by using image gradients. We derive a simple objective function that consists only of dot products. The maximum of this function corresponds to the location where most gradient vectors intersect and thus to the eye's centre. Although simple, our method is invariant to changes in scale, pose, contrast, and variations in illumination. We extensively evaluate our method on the very challenging BioID database for eye centre and iris localisation. Moreover, we compare our method with a wide range of state-of-the-art methods and demonstrate that it yields a significant improvement in both accuracy and robustness.
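Since the objective described in the abstract is just an average of squared dot products between normalised displacement vectors and normalised image gradients, it can be sketched in a few lines. The following NumPy snippet is a minimal, brute-force illustration of that idea, not the authors' implementation; the function name, the gradient-magnitude threshold, and the use of np.gradient are assumptions made for this sketch.

```python
import numpy as np

def eye_centre_map(grey):
    """Score every pixel of a greyscale eye region as a candidate centre.

    For a candidate centre c and each pixel x_i with a strong gradient
    g_i, form the normalised displacement d_i = (x_i - c)/||x_i - c||
    and accumulate the squared dot product (d_i . g_i)^2; the centre
    estimate is the argmax of the resulting score map.
    """
    grey = grey.astype(float)
    gy, gx = np.gradient(grey)            # image gradients (rows, cols)
    mag = np.hypot(gx, gy)
    strong = mag > mag.mean()             # threshold choice is an assumption
    ys, xs = np.nonzero(strong)
    gxn = gx[strong] / mag[strong]        # normalised gradient vectors
    gyn = gy[strong] / mag[strong]

    h, w = grey.shape
    score = np.zeros((h, w))
    for cy in range(h):                   # brute force over all candidates
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            norm[norm == 0] = np.inf      # skip the candidate pixel itself
            dot = (dx / norm) * gxn + (dy / norm) * gyn
            score[cy, cx] = np.mean(dot ** 2)
    return score

# Usage: the estimated eye centre is the maximum of the score map.
# score = eye_centre_map(eye_region)
# cy, cx = np.unravel_index(score.argmax(), score.shape)
```

Squaring the dot product makes the score independent of gradient sign, and normalising both vectors is what gives the scale and contrast invariance mentioned above; the quadratic brute-force search here is for clarity only, and in practice one would restrict it to a detected eye region.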