Authors: Marcos Hervás; Rosa Ma Alsina-Pagès and Joan Navarro
Affiliation: La Salle - Universitat Ramon Llull, Spain
Keyword(s): Ambient Assisted Living, Sensor Network, Machine Hearing, Audio Feature Extraction, Machine Learning, Graphics Processor Unit.
Related Ontology Subjects/Areas/Topics: Applications and Uses; Home Monitoring and Assisted Living Applications; Sensor Networks
Abstract:
Human life expectancy has steadily grown over the last century, which has driven governments and institutions to increase their efforts in caring for the eldest segment of the population. The first answer to this growing need was the construction of hospitals and retirement homes, but these facilities have rapidly become overcrowded and their associated maintenance costs are becoming prohibitive. Therefore, modern trends attempt to take advantage of the latest advances in technology and communications to remotely monitor people with special needs at their own homes, increasing their quality of life with far less impact on their social lives. Nonetheless, this approach still requires a considerable amount of qualified medical personnel to track every patient at all times. The purpose of this paper is to present an acoustic event detection platform for assisted living that tracks a patient's status by automatically identifying and analyzing the acoustic events occurring in a house. Specifically, we have taken advantage of the capabilities of a Jetson TK1, with its NVIDIA Graphics Processing Unit, to collect the data in the house and process it to identify a closed set of events, which could guide doctors or care assistants in real time by tracking the patient at home. This is a proof of concept conducted with data from a single acoustic sensor, but in the future we plan to extract information from a sensor network deployed in several places throughout the house.
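The abstract describes a pipeline that extracts features from audio captured by an in-home sensor and classifies them into a closed set of acoustic events. The following is a minimal sketch of that idea, assuming MFCC features and an SVM classifier (the abstract does not specify which features or learning algorithm are used, nor does this sketch reflect the GPU processing on the Jetson TK1); the file names and event labels are hypothetical placeholders.

```python
# Hedged sketch: audio feature extraction + event classification,
# assuming MFCC features and an SVM (not confirmed by the abstract).
import librosa
import numpy as np
from sklearn.svm import SVC

def extract_features(path, n_mfcc=13):
    """Load an audio clip and summarize it as a mean MFCC vector."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical training clips for a closed set of household events.
train_files = ["door_slam.wav", "water_tap.wav", "tv_on.wav"]
train_labels = ["door", "water", "tv"]

X = np.vstack([extract_features(f) for f in train_files])
clf = SVC(kernel="rbf").fit(X, train_labels)

# Classify a newly captured event from the acoustic sensor.
print(clf.predict(extract_features("unknown_event.wav").reshape(1, -1)))
```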