Authors: Daniel Hahn, Frederik Beutler and Uwe D. Hanebeck
Affiliation: Intelligent Sensor-Actuator-Systems Laboratory, Institute of Computer Science and Engineering, Universität Karlsruhe (TH), Germany
Keyword(s): Augmented Reality, Human-Machine-Interface.
Related Ontology Subjects/Areas/Topics: Image Processing; Informatics in Control, Automation and Robotics; Robotics and Automation; Virtual Environment, Virtual and Augmented Reality
Abstract:
In this paper, we present an assistive system for hearing-impaired people that consists of a wearable microphone array and an Augmented Reality (AR) system. The system assists the user in communication situations where many speakers or sources of background noise are present. In order to restore the “cocktail party” effect, multiple microphones are used to estimate the positions of individual sound sources. To allow the user to interact in complex situations with many speakers, an algorithm for estimating the user’s attention is developed. This algorithm determines which sound sources are in the user’s focus of attention. It allows the system to discard irrelevant information and enables the user to concentrate on certain aspects of the surroundings. Depending on the user’s hearing impairment, the perception of the speaker in the focus of attention can be enhanced, e.g., by amplification or by speech-to-text conversion.
A prototype has been built to evaluate this approach. Currently, the prototype can locate sound beacons in three-dimensional space, perform a simple focus estimation, and present floating captions in the Augmented Reality view. The prototype uses an intentionally simple user interface in order to minimize distractions.
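The abstract does not specify how the focus estimation works; one minimal way to realize a "simple focus estimation" over localized sound sources is to keep only the sources whose bearing lies within a cone around the user's head direction. The sketch below illustrates that idea; the function name, the cone half-angle, and the data layout are all illustrative assumptions, not the paper's actual algorithm.

```python
import math

def focus_of_attention(head_direction, sources, half_angle_deg=30.0):
    """Return the names of sound sources inside a cone around the
    user's head direction -- a hypothetical sketch of a simple
    focus-of-attention estimator over localized sources.

    head_direction: 3-D Cartesian vector the user is facing.
    sources: list of (name, position) pairs, positions relative
             to the user's head.
    """
    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    h = normalize(head_direction)
    # A source is "in focus" if the angle between its bearing and
    # the head direction is at most half_angle_deg.
    threshold = math.cos(math.radians(half_angle_deg))

    in_focus = []
    for name, position in sources:
        bearing = normalize(position)
        # Dot product of unit vectors = cosine of the angle between them.
        if sum(a * b for a, b in zip(h, bearing)) >= threshold:
            in_focus.append(name)
    return in_focus
```

With this selection in place, sources outside the cone can be discarded, and the remaining in-focus speaker can be routed to amplification or captioning.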