4 CONCLUSIONS AND PERSPECTIVES
The system described here has proven efficient both for evaluating SSDs and for designing novel ones. The motion capture set-up on which it relies is an increasingly widespread tool that keeps becoming cheaper and easier to use.
The two series of experiments confirmed that immersion in a virtual 3D space offers a valuable means of testing SSDs through various tasks in various environments, such as identifying objects on a vertical surface or navigating between rooms. Beyond environments and tasks, they showed that the motion capture system makes the pointers easily interchangeable: any solid object can be converted into a pointer by positioning markers on it.
The series of trials with human users showed that the two evaluated pointing devices, one held in the hand and the other fastened to the head, lead to equivalent performance. Other criteria may therefore be taken into consideration for a final choice in subsequent development. Indeed, leaving the hands free for other uses is a considerable advantage of the headset. However, scanning around with head movements proved quite uncomfortable for the participants.
In typical sighted navigation and search, exploration is essentially performed by eye saccades, so the amplitude of head movements remains limited. Eye movements could therefore direct the pointer, leaving both hands free and reducing head movements. These advantages deserve to be tested, with the ultimate aim of helping people whose eye muscles remain functional while they suffer from late retinal blindness. In the line of (Twardon et al., 2013) and (Dietz et al., 2016), we have begun to work on eye pointing for visual sensory substitution.
!"#$%&'(
)*+"#,*-)*$
!"#$%&'
.)*.,#
/#&*.0,1)# 20$%&$,#
304(5&6)(+)0$,# "*($7)(-,$",*(
0&8$%#)(#)9)#)*$"&'
:,$",*(
0&8$%#)
;:<
"!")=>/5
;:<
"!")=>/5
?@AB@
3C4(5&6)(0,,#1"*&$). "*($7)(
D'&..).(#)9)#)*$"&'
3&4(5'&..).(0,,#1"*&$).
"*($7)(-,$",*(
0&8$%#)(#)9)#)*$"&'
VRPN
VRPN
Figure 6: Experimental set-up with eye-tracker (see text for
more details).
Fig. 6 shows the evolution of the presented experimental set-up towards studies of eye pointing by blind users. It is based on an SMI portable eye-tracker mounted on a spectacle frame equipped with reflective markers. The markers allow the VICON system to determine the position and orientation of the glasses (fig. 6(a)). The SMI iViewETG software determines the gaze coordinates within the glasses referential (fig. 6(b)). Finally, the SMI 3D-6D software merges these two pieces of information to deliver the gaze vectors in the VICON system referential through a VRPN server (fig. 6(c)). The genericity offered by VRPN allowed the rest of the set-up to remain unchanged: whether the pointer is a stick, a headset or the gaze is transparent to it. This prototype is currently being tested by human participants in conditions similar to those of the first experimental series.
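In essence, the merge of fig. 6(c) composes the glasses pose with the gaze direction: if the VICON system reports the glasses position p and orientation quaternion q, and the eye-tracker reports a gaze direction d in the glasses referential, the resulting gaze ray starts at p and points along R(q)d, where R(q) is the rotation encoded by q. On the client side, every pointer is then consumed through the same VRPN tracker interface. The following minimal C++ sketch of such a client illustrates this genericity; the device name "Gaze@localhost" is a hypothetical placeholder, not necessarily the name used in our set-up.

```cpp
// Minimal VRPN client sketch: subscribes to a 6-DoF tracker and prints each
// pose report. Because VRPN abstracts the device, the same client works
// whether the tracker is driven by a stick, a headset or the gaze.
#include <cstdio>
#include <vrpn_Shared.h>
#include <vrpn_Tracker.h>

// Invoked by VRPN for every tracker report (position + orientation quaternion).
void VRPN_CALLBACK handle_pose(void* /*userData*/, const vrpn_TRACKERCB t)
{
    std::printf("sensor %d  pos (%.3f, %.3f, %.3f)  quat (%.3f, %.3f, %.3f, %.3f)\n",
                (int)t.sensor,
                t.pos[0], t.pos[1], t.pos[2],
                t.quat[0], t.quat[1], t.quat[2], t.quat[3]);
}

int main()
{
    // "Gaze@localhost" is a hypothetical device name for illustration only.
    vrpn_Tracker_Remote tracker("Gaze@localhost");
    tracker.register_change_handler(nullptr, handle_pose);

    // Pump the VRPN message loop; handle_pose fires from within mainloop().
    while (true) {
        tracker.mainloop();
        vrpn_SleepMsecs(1); // avoid busy-waiting between polls
    }
}
```

Swapping the pointer then amounts to changing the device name passed to the server, which is why the stick, headset and gaze conditions could share the rest of the pipeline unchanged.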
Many other hardware and software implementations may be prototyped, not only at the input level (environment and sensors), but also at the transcoder and output levels.
ACKNOWLEDGEMENTS
We would like to thank Silvain Gerber from GIPSA-
Lab for his availability and help on the statistical anal-
ysis of the experiments.
Graduate students in engineering, neurosciences
and psychology contributed to the project.
This work has been partially supported by the
LabEx PERSYVAL-Lab (ANR-11-LABX-0025-01),
the AGIR PEPS program of Univ. Grenoble Alpes
ComUE and the Pôle Grenoble Cognition.
REFERENCES
Abboud, S., Hanassy, S., Levy-Tzedek, S., Maidenbaum, S., and Amedi, A. (2014). EyeMusic: Introducing a "visual" colorful experience for the blind using auditory sensory substitution. Restorative Neurology and Neuroscience.
Bach-Y-Rita, P., Collins, C. C., Saunders, F. A., White, B.,
and Scadden, L. (1969). Vision substitution by tactile
image projection. Nature, 221(5184):963–964.
Bates, D., Mächler, M., Bolker, B., and Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1):1–48.
Dietz, M., Garf, M. E., Damian, I., and André, E. (2016). Exploring eye-tracking-driven sonification for the visually impaired. In Proceedings of the 7th Augmented Human International Conference 2016, AH '16, pages 5:1–5:8.