Authors:
Aziliz Guezou-Philippe 1; Sylvain Huet 1; Denis Pellerin 1 and Christian Graff 2
Affiliations:
1 Univ. Grenoble Alpes, France; 2 Univ. Grenoble Alpes and Univ. Savoie Mont Blanc, France
Keyword(s):
Sensory Substitution, Virtual Environments, Motion Capture, Pointing Device, Eye Tracking.
Related Ontology Subjects/Areas/Topics:
Applications and Services; Computer Vision, Visualization and Computer Graphics; Enterprise Information Systems; Human and Computer Interaction; Human-Computer Interaction
Abstract:
Various audio-vision Sensory Substitution Devices (SSDs) are being developed to assist people without sight. They all convert optical information extracted from a camera into sound parameters, but they are evaluated on different tasks in different contexts. The use of 3D environments is proposed here to compare the advantages and disadvantages not only of software (transcoding) solutions but also of hardware (component) specifics, in various situations and activities. Using a motion capture system, the whole person, not just a guided avatar, was immersed in virtual places that were modelled and could be replicated at will. We evaluated the ability to hear depth in various tasks: detecting and locating an open window, and moving toward and crossing an open door. Participants directed the modelled depth camera with a real pointing device that was either held in the hand or fastened to the head. Mixed effects on response delays were analyzed with a linear model to highlight the respective importance of the pointing device, the target specifics and the individual participants. The results encourage further use of our prototyping set-up to test many solutions by implementing, e.g., other environments, sensor devices, transcoding rules, and pointing devices, including the use of an eye-tracker.
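For readers curious how a mixed-effects analysis of response delays can be set up in practice, a minimal sketch follows. It is illustrative only: the statsmodels library, the CSV file name, and the column names (delay, device, target, participant) are assumptions, not the authors' actual pipeline.

# Minimal sketch (assumed, not the authors' code): linear mixed-effects model
# of response delay, with pointing device and target as fixed effects and a
# per-participant random intercept.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed data layout: one row per trial with columns
# "delay" (response delay in s), "device" ("hand" or "head"),
# "target" (target specifics), "participant" (subject id).
data = pd.read_csv("response_delays.csv")

# Fixed effects: device and target; random intercept grouped by participant.
model = smf.mixedlm("delay ~ device + target", data, groups=data["participant"])
result = model.fit()
print(result.summary())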