TACTILE GUIDANCE OF THE HAND IN A BLIND POINTING
TASK: “THE TACTILE COMPASS”
Tactile Compass in a Blind Pointing Task
M-C. Lepelley, L. Lejeune, F. Thullier, E. Faugloire and F. G. Lestienne
ERT 2002 « Rapsodie », EA 4260 IOA, MODESCO UMS 843 CNRS, Université de Caen Basse-Normandie, France
Keywords: Vibrotactile device, Tactile compass, Spatial guidance, Directional prescriptors, Pointing task.
Abstract: Tactile skin receptors are sensitive to vibration, which allows the use of a “tactile compass”: a matrix of micro-vibrators that reproduces tactile encoding on the skin surface to orient the wearer. The tactile compass used in this study consisted of 49 microvibrators laid out in a 7x7 matrix. Each microvibrator contained an inertial vibrating element activated by a micromotor. The tactile messages were provided in a dynamic way by the successive activation of individual microvibrators. The present study investigated the efficiency of the tactile compass, inserted into an abdominal girdle, in guiding the hand in a blind pointing task. More specifically, the performances obtained using tactile coding are compared to those obtained using verbal instructions. The participants had to point from the central target towards one of the four other targets, each corresponding to one of the six directions (upwards, downwards, left, right, backwards and forwards) located either in the frontal plane or in the horizontal plane. Overall, the results reveal the efficiency for gesture guidance of providing tactile messages in a dynamic way, without any need for learning. In addition, they establish that the tactile information transmitted via our vibrotactile device is involved in the processes of both motor control and production of movement in tridimensional space.
1 INTRODUCTION
The tactile compass is based on the well-known
clinical test of “skin writing”, which consists of
tracing different characters (letters or digits) on the
body surface of the subject. While this test can reveal
changes at the level of the tactile receptors, as well as
low-level changes in the central nervous system
(CNS), the tactile compass test can, by contrast, yield
valuable data about the mechanisms underlying
perceptual and gnostic functions at a higher level of
the CNS (Natsoulas and Dubanoski, 1964; Caffara et
al., 1976; Parsons and Shimojo, 1987; Gurfinkel et
al., 1993).
Taking into account that (1) the localisation of the
skin’s mechanoreceptors is quite well represented in
the CNS (Phillips, 1988), and (2) the brain’s processing
of the tactile signal is characterised by a high level
of sensitivity, acuity and rapidity (Johansson et al.,
1982; Johnson and Phillips, 1981),
vibrostimulation of the skin is a useful way to
present information in tactile form instead of
visual or auditory form (Bliss et al., 1970;
Jagacinski et al., 1979).
The use of a “tactile retina” as a substitute for the
retina of the eye in the blind has been studied
extensively over the years. For example, early
pioneering work on the Tactile Vision Substitution
System (TVSS) was performed by Paul Bach-y-Rita
and colleagues in the late 1960s. The TVSS
displayed visual information captured by a tripod-
mounted TV camera on a vibrotactile display on the
user’s back (Bach-y-Rita, 1982, 2004).
According to the ‘tap-on-the-shoulder’ principle
(Van Erp, 2005), vibrotactile devices consisting of
vibrating elements fitted to various body locations
allow spatial guidance (Lepelley et al., 2005;
Lepelley, 2008). Among successful applications
with relatively simple displays is Van Erp’s device,
which presented directions from an in-car navigation
system by means of vibrating elements located under
the left and right legs (Van Erp and Van Veen,
2004). In the same vein, we should mention the
study of Rochlis and Newman (2000), who
presented directional information during simulated
extra-vehicular activity in space by means of
vibrating elements located on the torso and the neck.
More complex displays consist of 60 (or more)
vibrating elements covering the entire torso of the
user. These torso displays present not only the left
and right directions, but also map eight or more
external directions in the horizontal plane.
This paper is devoted to investigating the
efficiency of a prototype vibrotactile device called
the “Tactile Compass” (TC), inserted into an
abdominal belt, in the perception and identification
of tactile stimuli conveying appropriate spatial
information. More precisely, the TC, consisting of a
7x7 matrix of micro-electromechanical vibrators,
provides tactile messages in a dynamic way. The
trajectory and duration of the vibration were designed
to create tactile semantic encoding prescriptors
such as directional, kinetic and kinesiological ones.
The present study investigated the efficiency of
the TC in guiding the hand during a pointing task.
More specifically, in a blind experiment, the
performances obtained using tactile encoding focused
on the directional prescriptors are compared here to
those obtained using verbal instructions.
2 METHODS
2.1 Apparatus
The vibrotactile device (Caylar Society ©)
consisted of 49 microvibrators (called “pins”) laid
out in a 7x7 matrix (Figure 1), a power unit, a
Microchip micro-control unit (PIC16F688) and a
serial port connector.
Figure 1: Prototype of the tactile compass (TC) (Caylar
Society ©). The centre of the TC was located 50 mm
above the umbilicus.
Each pin consisted of an inertial vibrating
element (VE) activated by a micromotor (2 mm in
diameter) based on classic technology. The VE was
designed with a conic section giving a skin contact
area of 1 square millimetre. The distance
between pins was 6 mm. The oscillation
frequency of the pins was 50-60 Hz with a
magnitude of 2 mm. The 7x7 pins were mounted in a
small PVC box with a 63 x 63 mm square area. To
ensure proper reception of the vibration at each pin and
to minimize the lateral propagation of the vibration,
each pin was housed in a Plexiglas honeycomb cell
specifically manufactured for this purpose. Each pin
was glued to its cell with synthetic latex.
In the inactive state, all 49 pins were in
contact with the skin. In the active state, the tactile
messages were provided in a dynamic way by the
successive activation of individual pins.
The TC was mounted in an adjustable abdominal
belt. The location of the TC in the belt was also
adjustable so that it could easily be positioned
regardless of the subject’s body shape.
2.2 Software of the Micro-Control Unit
(MCU)
During tactile stimulus presentation, the pins were
activated in sequence to form the desired tactile
pattern. The duration (d) of activation of each pin
and the time interval (t) between the activations of
successive pins were specified using the control
library of the MCU (see Figure 3).
2.2.1 Pin Mapping
The MCU and the data collection software were
written entirely in C++.
The MCU software is driven by a host control
program that specifies the following by means of
the control library of the MCU:
- the position of each individual pin in a Cartesian
(x, y) plane (Figure 2 and the upper part of Figure 3),
the pins being numbered from 1 to 49 (a minimal
sketch of such a mapping is given after this list);
- the temporal characteristics of activation (upper
part of Figure 3): duration (d) and time interval (t);
- pre-specified patterns of pin movement (bottom
part of Figure 3).
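For illustration only, the following minimal C++ sketch shows one plausible mapping between pin numbers 1-49 and (x, y) positions in the 7x7 matrix; the row-major numbering and the function names are our assumptions, not the actual MCU control library.

    #include <utility>

    // Hypothetical pin-to-coordinate mapping for the 7x7 matrix, assuming the
    // pins are numbered 1..49 in row-major order (row 0 at the top of the display).
    std::pair<int, int> pinToXY(int pin) {
        int index = pin - 1;            // 0..48
        return { index % 7,             // x: column 0..6
                 index / 7 };           // y: row 0..6
    }

    // Inverse mapping: (x, y) -> pin number 1..49.
    int xyToPin(int x, int y) {
        return y * 7 + x + 1;
    }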
Figure 2: Schematic diagram of the generation of pin
activation in a Cartesian plane (x, y).
2.2.2 Alpha-Numeric Screen
The alpha-numeric screen gives access to the control
library, which converts the tactile prescriptors
(directional, kinetic and kinesiological) provided by
the experimenter into the voltages that drive each of
the 49 pins.
Figure 3: Overview of the alpha-numeric screen (see
explanations in the text).
The upper part of the screen illustrates the
sequential commands used to form the desired
“tactile” character “9”. In this example, the duration
of the drawing of “9” was 1500 ms, with d = 100 ms
and t = 0 ms.
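Assuming that the pin activations are strictly sequential, with each pin vibrating for d ms followed by a pause of t ms before the next pin starts (this timing model is our reading; it is not stated explicitly), the total drawing time of an n-pin pattern is approximately T = n × (d + t). With d = 100 ms and t = 0 ms, the reported 1500 ms would thus correspond to a trace of “9” using about 15 pins.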
The bottom part of the screen depicts 40 pre-
specified tactile patterns used as tactile semantic
encoding prescriptors (a hypothetical encoding of
one such prescriptor is sketched after this list):
- Directional prescriptors: Upwards, Downwards,
Right, Left, Forwards, Backwards…
- Kinetic prescriptors: Stop, Accelerate, Brake…
- Kinesiological prescriptors: Ascend, Descend,
Turn…
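As a purely illustrative sketch, the code below shows how a directional prescriptor such as “Upwards” might be defined as a sequence of pins (here, the central column swept from bottom to top, using the row-major numbering assumed above) and played back with activation duration d and inter-pin interval t. The pattern shape, the stub pin-driver functions and the timing model are assumptions; they do not reproduce the actual firmware interface.

    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Stub pin drivers: in the real device these would switch the micromotors on and off.
    void activatePin(int pin)   { std::printf("pin %d on\n", pin); }
    void deactivatePin(int pin) { std::printf("pin %d off\n", pin); }

    // Plays a pin sequence, assuming strictly sequential (non-overlapping) activations:
    // each pin vibrates for d_ms, then pauses for t_ms before the next pin starts.
    void playPattern(const std::vector<int>& pins, int d_ms, int t_ms) {
        for (int pin : pins) {
            activatePin(pin);
            std::this_thread::sleep_for(std::chrono::milliseconds(d_ms));
            deactivatePin(pin);
            std::this_thread::sleep_for(std::chrono::milliseconds(t_ms));
        }
    }

    int main() {
        // "Upwards": central column (x = 3) swept from the bottom row (y = 6) to the
        // top row (y = 0), with pin = y * 7 + x + 1 under the assumed numbering.
        std::vector<int> upwards = {46, 39, 32, 25, 18, 11, 4};
        playPattern(upwards, 200, 150);   // d = 200 ms, t = 150 ms (values used in the Discussion)
        return 0;
    }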
2.3 Protocol
2.3.1 Subjects
Twelve young subjects (6 males and 6 females;
29.23 ± 3.2 years) took part in this study. All were
right-handed, as ascertained by the hand they
preferred to use for writing and eating. They were
recruited among students of Caen University and
had never participated in a tactile perception task
before.
All subjects gave informed consent to take part
in the experiment according to the institutional
procedures of our university.
2.3.2 Task and Experimental Procedure
During a familiarization phase, the participants had
to point from the central target towards one of the
four other targets, each target corresponding to one
of the six directions (upwards, downwards, left,
right, forwards and backwards) (Figures 4C and 5)
located either in the frontal plane or in the horizontal
plane (Figures 4A and 4B). In the test phase, the
directional targets were removed and the participants
had to point at the target whose direction was
indicated by either tactile instructions (on the
abdomen) or verbal instructions. Both the type of
instruction (tactile and verbal) and the type of plane
(horizontal and frontal) were counterbalanced across
participants.
Figure 4: Schematisation of the two planes of the pointing
task (A: frontal plane, B: horizontal plane) and of the
positions of the targets (C).
Kinematic data were analysed to measure the
precision and the velocity of the pointing task. For
this purpose a motion capture system (Vicon MX-
40, Oxford Metrics Ltd) including 4 cameras with
a 100 Hz sampling frequency allowed us to
record the 3D positions of 9 markers located on the
fingertip, wrist, elbow, shoulders, hips and head
(Figure 5).
Figure 5: Experimental setup (left) and display of the 9
markers during pointing to the central target.
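For completeness, the sketch below (ours, not the authors’ analysis software) illustrates how the mean velocity of a marker and the 3D constant error of pointing can be computed from marker trajectories sampled at 100 Hz; the data layout and function names are assumptions.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Point3 { double x, y, z; };          // marker position in mm

    // Mean tangential velocity (mm/s) of a marker trajectory sampled at fs Hz.
    double meanVelocity(const std::vector<Point3>& traj, double fs) {
        if (traj.size() < 2) return 0.0;
        double path = 0.0;
        for (std::size_t i = 1; i < traj.size(); ++i) {
            double dx = traj[i].x - traj[i - 1].x;
            double dy = traj[i].y - traj[i - 1].y;
            double dz = traj[i].z - traj[i - 1].z;
            path += std::sqrt(dx * dx + dy * dy + dz * dz);   // length of this step
        }
        return path * fs / (traj.size() - 1);   // total path / total duration
    }

    // 3D constant error: distance between the mean pointing endpoint and the target.
    double constantError3D(const std::vector<Point3>& endpoints, const Point3& target) {
        Point3 m{0.0, 0.0, 0.0};
        for (const Point3& p : endpoints) { m.x += p.x; m.y += p.y; m.z += p.z; }
        const double n = static_cast<double>(endpoints.size());
        m.x /= n; m.y /= n; m.z /= n;
        return std::sqrt((m.x - target.x) * (m.x - target.x) +
                         (m.y - target.y) * (m.y - target.y) +
                         (m.z - target.z) * (m.z - target.z));
    }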
3 RESULTS
The results demonstrate the effectiveness of the TC
in the tactile guidance of the hand during a blind
pointing task. The level of performance clearly
exceeded the halfway point between chance and
perfect performance: the participants identified
93.27 % of the tactile directions (Figure 6).
Moreover, no significant effect of the type of
instruction (verbal versus tactile) on the number of
correctly identified directions (χ2 = 1.18; df = 1;
ns) was noticed. Similarly, no significant interaction
between type of instruction and precision of pointing
(F(1,10) = 0.96; ns), nor between type of instruction
and variability of pointing (F(1,10) = 1.24; ns), was
noticed.
In addition, no significant effect of type of
instruction was found in the kinematic data (mean
velocity: F(1,10) = 0.28; ns; planes of motion:
F(1,10) = 0.002; ns). Consequently, the prototype
provides information about direction that is at least
as well perceived as verbal instruction, and does so
without disturbing the spatio-temporal organization
of the movement.
Figure 6: Percentage of correctly identified directions for
verbal and tactile instructions.
It is of interest to note in Figure 7 that pointing
was faster in the horizontal plane than in the frontal
plane (F(1,10) = 6.44; p < 0.05).
Figure 7: Mean velocity in mm·s⁻¹ for horizontal (blue)
and frontal (red) planes during pointing.
Furthermore, the precision of pointing was better
in the frontal plane than in the horizontal plane
(F(1,10) = 6.59; p < 0.05). The mean deviations
were greater in the horizontal plane than in the frontal
plane (F(1,10) = 39.52; p < 0.05). These results are
in agreement with those obtained by Ghafouri and
Lestienne (2006), who showed an under-
representation of horizontal egocentric space in the
internal representation of space.
Figure 8: 3D constant error (left) and mean deviations
(right), in mm, for horizontal (blue) and frontal (red)
planes during pointing.
4 DISCUSSION
Overall, the results of this study reveal the efficiency
of providing tactile messages in a dynamic way.
Indeed, the present experiment confirms that
subjects are able to indicate an external spatial
orientation that matches successive vibrotactile point
stimuli delivered on a small surface (63 × 63 mm²) of
the skin of the abdomen. The tactile discrimination
between the six directional prescriptors was
remarkably consistent across the 12 subjects. In other
words, this study showed that a localized vibration
delivered through an abdominal belt can easily and
accurately be interpreted as a direction in the
horizontal and frontal planes, and can be used to
signal multi-directional motion and to guide hand
movement. Through these initial empirical findings,
the results confirm that the skin surface of the human
body (hand, arm, leg, neck, torso, abdomen) is very
sensitive to the temporal aspects of vibrotactile
stimulation (Gurfinkel et al., 1993; Van Erp and
Werkhoven, 2004; Lepelley et al., 2005; Lepelley,
2008; Asseman et al., 2008).
The primary goal of the present study was to
gain insight into the characteristics of the generation
of the tactile message. Consequently, we tested the
appropriate spatial and temporal resolution of the
tactile pattern (see appendix) used to generate the
tactile semantic encoding (an indicative timing
calculation follows this list):
a) distance between pins: 6 mm;
b) vibration frequency: 50-60 Hz;
c) activation time (d) of each pin: 200 ms;
d) step time (t) between the activations of successive
pins: 150 ms.
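Assuming, as in the worked example of Section 2.2.2, that each pin is active for d ms and is followed by a pause of t ms before the next activation, a straight stroke across the full 7-pin width of the matrix would take roughly T = 7 × (200 ms + 150 ms) ≈ 2.45 s. This figure is only indicative: the number of pins per directional pattern and the exact timing model are assumptions.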
It is of interest to stress that the precision of
pointing was better in the frontal plane than in the
horizontal plane. These results are in agreement with
those obtained by Ghafouri and Lestienne (2006)
in a task involving pointing to virtual
targets located in the sagittal, frontal and horizontal
planes. Errors were minimal for the sagittal and
frontal planes and maximal for the horizontal plane.
These disparities in errors were considerably
reduced when subjects pointed using a visual guide.
These findings imply that the different planes are
centrally represented and characterized by
different errors when subjects use a body-centered
frame to perform the blind pointing, and they suggest
that the representation of peripersonal space may be
anisotropic. This study also reveals the high stability of
the egocentric reference system (Lestienne and
Gurfinkel, 1988). This is consistent with the
findings of our previous work performed in
microgravity, dedicated to the perception and
interpretation of complex tactile stimuli (Gurfinkel
et al., 1993). Based on these findings, we can assume
that the tactile patterns encoding the prescriptors are
perceived as a “local sign” whose interpretation takes
into account information about the configuration of
the body (Gurfinkel et al., 1994, and the appendix).
5 CONCLUSIONS
On the basis of the perceptual process of orientation
in 3D space through the use of tactile cues, the
results of this experiment establish that the tactile
information transmitted via our TC successfully
supports the tactile guidance of the hand in
tridimensional space.
One of the major advantages foreseen in using the
TC is its portability. This compact and lightweight
TC can be comfortably incorporated into the user’s
clothing without impairing movement.
On the basis of the positive experiences with the
TC, our laboratory has recently been involved in an
investigation into the potential of the TC in land
navigation.
The key to the successful implementation of the TC
lies in its ability to convey effective directional
information, providing the user with a major
safety enhancement: “moving with hands and eyes
free”.
In future work, the TC presents a number of
promising functional opportunities for use in clinical
and rehabilitation applications. These include
assistance in balance and in the coordination of
movements, by combining the TC with virtual reality
techniques.
ACKNOWLEDGEMENTS
The authors gratefully acknowledge Douglas
McCarthy for linguistic corrections and would like
to thank subject volunteers.
This research was funded by two projects:
-Ministère de l’Enseignement Supérieur et de la
Recherche (MESR): Action Concertée Incitative
(ACI 2004-2007) « Terrain, Techniques &
Théories » Langage tactile et orientation du corps
dans l’espace: une assistance technique au contrôle
de l’orientation dans l’espace tridimensionnel dans
le cas de déficits sensorimoteurs.
-Ministère de la Défense: Délégation Générale
pour l’Armement (DGA) Grant: REI (Exploratory
and Innovative Research) No 2008 34 0030.
REFERENCES
Asseman, F., Bronstein, A.M., Gresty, M.A., 2008,
Guidance of visual direction by topographical
vibrotactile cues on the torso, Exp Brain Res, 186,
283-292.
Bach-y-Rita, P., 1982, Sensory substitution, volume
transmission and rehabilitation: emerging concepts,
In Illis L-S (Ed), Neurological rehabilitation,
Blackwell, Oxford, 2nd edition, pp 457-468.
Bach-y-Rita, P., 2004, Tactile sensory substitution
studies, Ann. NY Acad. Sci., 1013, 83-91.
Bliss, J.C., Katcher, M.H., Rogers, C.H., Shepard, R.P.,
1970, Optical-to-tactile image conversion for the
blind, IEEE Trans. Man-Machine Systems, 11, 58-65.
Caffara, P., Mazzucchi, A., Parma, M., 1976, Osservazioni
sulla percezione cutanea degli stimoli tattili figurati
nell’uomo in condizioni normali, Boll Soc Ital Biol
Sperim, 52, 2092-2095.
Ghafouri, M., Lestienne, F.G., 2006, Contribution of
reference frames for movement planning in
peripersonal space representation, Exp Brain Res,
169, 24-36.
Gurfinkel, V.S., Lestienne, F., Levik, Yu.S., Popov, K.E.,
1993, Egocentric references and human spatial
orientation in microgravity. I. Perception of complex
tactile stimuli, Exp Brain Res, 95(2), 339-342.
Jagacinski, R.J.,Miller, D.P., Gilson, R.D., 1979, A
comparison of kinesthetic-tactual and visual displays
in a critical tracking task, Human Factors, 21, 65-71.
Johansson, R.S., Landström U., Lundström R.J.I., 1982,
Responses of mechanoreceptive afferent units in the
glabrous skin of the human hand to sinusoidal skin
displacements, Brain Res., 244, 17-25.
Johnson, K.O., Phillips, J.R., 1981, Tactile spatial
resolution: I. Two-point discrimination, gap detection,
grating resolution and letter recognition, J
Neurophysiol, 46, 1177-1191.
Lestienne, F.G., Gurfinkel, V.S., 1988, Postural control in
weightlessness: a dual process underlying adaptation
to an unusual environment, Trends Neurosci 11: 359-
63.
Lepelley, M.C., 2008, Production du geste dans l’espace
tridimensionnel : du mouvement dansé au guidage
tactile du mouvement de pointage, PhD thesis,
University of Caen Basse-Normandie.
Lepelley, M.C., El Idrissi, H., Thullier, F., Lestienne, F.,
2005, Body orientation and identification of complex
tactile stimuli, Proceedings of the 5th Progress in
Motor Control conference, Pennsylvania State University.
Natsoulas, T., Dubanoski, R.A., 1964, Inferring the locus
and orientation of the perceiver from responses to
stimulation of the skin, Am J Psychol, 77, 281-285.
Parsons, L.M., Shimojo, S., 1987, Perceived spatial
orientation of cutaneous patterns on surfaces of the
human body in various positions, J Exp Psychol, 13,
488-504.
Phillips, J.R., 1988, Spatial pattern representation and
transformation in monkey somatosensory cortex,
Trends Neurosci, 11, 356-357.
Rochlis, J.L., Newman, D.J., 2000, A tactile display for
International Space Station (ISS) Extra Vehicular
Activity (EVA), Aviation, Space, and Environmental
Medicine, 71, 571-578.
Van Erp J.B.F., 2005, Presenting directions with a
vibrotactile torso display, Ergonomics, 48 (3), 302–
313.
Van Erp, J.B.F., Van Veen, H.A.H.C., 2004, Vibrotactile
in-vehicle navigation system, Transportation Research
Part F: Traffic Psychology and Behaviour, 7, 247-256.
Van Erp J.B.F., Werkhoven, P.W., 2004, Vibro-tactile and
visual asynchronies: Sensitivity and consistency,
Perception, 33, 103-111.
APPENDIX
Perception and interpretation of cutaneous stimuli
applied to the same tactile receptive field for
different postures and different orientations of the
body segment in question can yield valuable data
about the perceptual and gnostic functions during the
learning process leading to tactile shape recognition.
The studies were conducted on 17 subjects. Two
postures were studied (see Figure 9): the vertical
(1, 2) and horizontal (3, 4) positions.
For each posture two conditions were examined:
with the right leg extended (1, 3) and during flexion
of the right leg at 90° between hip and knee joints
(2, 4).
In each situation, one of the 4 tactile stimuli was
presented: digits and simple geometric shapes (see
Figure 9). The TC was fixed on the front surface of
the right thigh with Velcro strips. The results show
that the task of identifying complex tactile stimuli
was not affected by modifications of posture relative
to the gravitational vertical. However, an increase in
the frequency of errors was observed when the leg
was flexed: the part of the figure closest to the knee
was perceived as being on top. It is important to note
that with increasingly intensive training, we observed
a corresponding improvement in the task.
Information about the configuration of the body
part is taken into account during the stages of
processing the tactile signal.
Figure 9: Schematisation of the tactile stimuli (left) and
the postural conditions (right) (from Lepelley et al., 2005).