Authors:
Pedro Miguel Faria; Rodrigo A. M. Braga; Eduardo Valgôde and Luís Paulo Reis
Affiliation:
LIACC – Artificial Intelligence and Computer Science Lab. – University of Porto; FEUP – Faculty of Engineering of the University of Porto, Portugal
Keyword(s):
Human-computer interface, computer vision, image processing, artificial intelligence, intelligent wheelchair.
Related Ontology Subjects/Areas/Topics:
Accessibility to Disabled Users; Artificial Intelligence; Artificial Intelligence and Decision Support Systems; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Computer-Supported Education; Enterprise Information Systems; Health Engineering and Technology Applications; Human Factors; Human-Computer Interaction; Intelligent User Interfaces; Machine Perception: Vision, Speech, Other; Methodologies and Methods; Multimedia Systems; Neural Network Software and Applications; Neural Networks; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Signal Processing; Soft Computing; Theory and Methods; Ubiquitous Learning; User Needs
Abstract:
Many physically disabled people use electric wheelchairs as an aid to locomotion. Commanding this type of wheelchair usually requires the use of one's hands, which poses a problem to those who, besides being unable to use their legs, are also unable to properly use their hands. The aim of the work described here is to create a prototype of a wheelchair command interface that does not require hand usage. Facial expressions were chosen instead to provide the visual information the interface needs to recognize user commands. The facial expressions are captured by a digital camera and interpreted by an application running on a laptop computer mounted on the wheelchair. The software includes digital image processing algorithms for feature detection, such as colour segmentation and edge detection, followed by a neural network that uses these features to detect the desired facial expressions. A simple simulator, built on top of the well-known Ciber-Mouse platform, was used to validate the approach by simulating the control of the intelligent wheelchair in a hospital environment. The results obtained from this platform provide strong evidence that it is possible to comfortably drive an intelligent wheelchair using facial expressions.
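To make the pipeline described in the abstract concrete, the sketch below (Python, using OpenCV, NumPy and scikit-learn) shows one plausible way to combine colour segmentation and edge detection into a feature vector and feed it to a small neural network classifier. It is an illustrative reconstruction, not the authors' implementation: the skin-colour thresholds, the 8x8 grid features, the MLPClassifier, and the COMMANDS set are all assumptions, and the training data here is synthetic.

import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical command vocabulary for the wheelchair interface.
COMMANDS = ["forward", "left", "right", "stop"]

def extract_features(frame_bgr):
    """Build a small feature vector from colour segmentation and edge detection."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Segment skin-like pixels in HSV space; threshold values are illustrative only.
    skin_mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    # Detect edges on the greyscale frame with the Canny operator.
    edges = cv2.Canny(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY), 100, 200)
    # Downsample both maps to coarse 8x8 grids of densities and concatenate them.
    skin_grid = cv2.resize(skin_mask, (8, 8), interpolation=cv2.INTER_AREA)
    edge_grid = cv2.resize(edges, (8, 8), interpolation=cv2.INTER_AREA)
    return np.concatenate([skin_grid.ravel(), edge_grid.ravel()]) / 255.0

# Train the neural network on labelled frames (synthetic placeholders here;
# in practice these would be feature vectors from recorded facial expressions).
rng = np.random.default_rng(0)
X_train = rng.random((200, 128))
y_train = rng.integers(0, len(COMMANDS), 200)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
net.fit(X_train, y_train)

# Classify one camera frame (a blank image stands in for a real capture).
frame = np.zeros((240, 320, 3), dtype=np.uint8)
command = COMMANDS[net.predict([extract_features(frame)])[0]]
print(command)

In a real deployment the classifier's output would be mapped to the simulated wheelchair's motion commands, with frames read continuously from the camera rather than classified one at a time.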