
Authors: Pedro Miguel Faria; Rodrigo A. M. Braga; Eduardo Valgôde and Luís Paulo Reis

Affiliation: LIACC – Artificial Intelligence and Computer Science Lab. – University of Porto; FEUP – Faculty of Engineering of University of Porto, Portugal

ISBN: 978-972-8865-92-4

Keyword(s): Human-computer interface, computer vision, image processing, artificial intelligence, intelligent wheelchair.

Related Ontology Subjects/Areas/Topics: Accessibility to Disabled Users ; Artificial Intelligence ; Artificial Intelligence and Decision Support Systems ; Biomedical Engineering ; Biomedical Signal Processing ; Computational Intelligence ; Computer-Supported Education ; Enterprise Information Systems ; Health Engineering and Technology Applications ; Human Factors ; Human-Computer Interaction ; Intelligent User Interfaces ; Machine Perception: Vision, Speech, Other ; Methodologies and Methods ; Multimedia Systems ; Neural Network Software and Applications ; Neural Networks ; Neurocomputing ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Signal Processing ; Soft Computing ; Theory and Methods ; Ubiquitous Learning ; User Needs

Abstract: Many people with physical injuries use electric wheelchairs as an aid to locomotion. Commanding this type of wheelchair usually requires the use of one's hands, which poses a problem for those who, besides being unable to use their legs, are also unable to properly use their hands. The aim of the work described here is to create a prototype of a wheelchair command interface that does not require hand usage. Facial expressions were chosen instead to provide the visual information the interface needs in order to recognize user commands. The facial expressions are captured by a digital camera and interpreted by an application running on a laptop computer mounted on the wheelchair. The software includes digital image processing algorithms for feature detection, such as colour segmentation and edge detection, followed by a neural network that uses these features to detect the desired facial expressions. A simple simulator, built on top of the well-known Ciber-Mouse platform, was used to validate the approach by simulating the control of the intelligent wheelchair in a hospital environment. The results obtained from the platform provide strong evidence that it is possible to comfortably drive an intelligent wheelchair using facial expressions.
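The processing pipeline described in the abstract (colour segmentation and edge detection producing features, which a neural network maps to an expression class) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the thresholds, the 4x4 feature grid, the `TinyMLP` class, and the four command classes are all assumptions made for the example.

```python
import numpy as np

def colour_segment(rgb):
    # Crude skin-colour mask in RGB space (thresholds are illustrative,
    # not taken from the paper).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def edge_map(gray):
    # Simple finite-difference edge magnitude (a stand-in for whatever
    # edge detector the paper actually uses).
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    return gx + gy

def features(rgb):
    # Combine both cues: edge density inside the colour-segmented region,
    # pooled over a coarse 4x4 grid, giving a 16-element feature vector.
    mask = colour_segment(rgb)
    edges = edge_map(rgb.mean(axis=-1))
    h, w = mask.shape
    feats = []
    for i in range(4):
        for j in range(4):
            cell = np.s_[i * h // 4:(i + 1) * h // 4,
                         j * w // 4:(j + 1) * w // 4]
            feats.append(float((edges[cell] * mask[cell]).mean()))
    return np.array(feats)

class TinyMLP:
    # One-hidden-layer network mapping the feature vector to one of four
    # hypothetical wheelchair commands (e.g. forward/left/right/stop).
    # Weights here are random; a real system would train them on labelled
    # facial-expression images.
    def __init__(self, n_in=16, n_hidden=8, n_out=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))

    def predict(self, x):
        hidden = np.tanh(x @ self.W1)
        return int(np.argmax(hidden @ self.W2))
```

A frame from the camera would then be classified with `TinyMLP().predict(features(frame))`, and the resulting command forwarded to the wheelchair (or to the Ciber-Mouse-based simulator).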

License: CC BY-NC-ND 4.0


Paper citation in several formats:
Faria, P. M.; Braga, R. A. M.; Valgôde, E. and Reis, L. P. (2007). PLATFORM TO DRIVE AN INTELLIGENT WHEELCHAIR USING FACIAL EXPRESSIONS. In Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 4: ICEIS, ISBN 978-972-8865-92-4, pages 164-169. DOI: 10.5220/0002394101640169

@conference{iceis07,
author={Faria, Pedro Miguel and Braga, Rodrigo A. M. and Valgôde, Eduardo and Reis, Luís Paulo},
title={PLATFORM TO DRIVE AN INTELLIGENT WHEELCHAIR USING FACIAL EXPRESSIONS},
booktitle={Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 4: ICEIS},
year={2007},
pages={164-169},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002394101640169},
isbn={978-972-8865-92-4},
}

TY - CONF

JO - Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 4: ICEIS
TI - PLATFORM TO DRIVE AN INTELLIGENT WHEELCHAIR USING FACIAL EXPRESSIONS
SN - 978-972-8865-92-4
AU - Faria, P. M.
AU - Braga, R. A. M.
AU - Valgôde, E.
AU - Reis, L. P.
PY - 2007
SP - 164
EP - 169
DO - 10.5220/0002394101640169
ER -
