
gets with their index finger guided by sounds. Results showed that Fitts’ law is a valuable model for evaluating target reaching, even in 3D non-visual interfaces with non-spatialized sonification as feedback. In a second, comparative experiment, we used Fitts’ law as a standardized metric to compare the performance of two sound guidance systems, using the slope of the linear regression between Movement Time (MT) and Index of Difficulty (ID). Results showed the advantage of a non-spatialized sonification that dissociates the vertical and horizontal information about the target’s position into two sound streams (within the same channel) over a non-spatialized sonification that encodes both dimensions with a single metric and a single sound stream.
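To make the comparison procedure concrete, the following minimal sketch (in Python, with illustrative variable names and hypothetical trial data of our own choosing, and using the Shannon formulation of ID, which is only one common variant) fits MT = a + b · ID for two guidance conditions so that their slopes can be compared:

import numpy as np

def index_of_difficulty(distance, width):
    # Shannon formulation of ID (in bits); other formulations exist.
    return np.log2(np.asarray(distance) / np.asarray(width) + 1.0)

def fit_fitts_regression(distance, width, movement_time):
    # Least-squares fit of MT = a + b * ID; returns (intercept a, slope b).
    ID = index_of_difficulty(distance, width)
    b, a = np.polyfit(ID, np.asarray(movement_time), 1)
    return a, b

# Hypothetical per-trial data: distances (m), target widths (m), movement times (s).
a1, b1 = fit_fitts_regression([0.4, 0.6, 0.8], [0.08, 0.04, 0.02], [1.1, 1.6, 2.3])
a2, b2 = fit_fitts_regression([0.4, 0.6, 0.8], [0.08, 0.04, 0.02], [1.3, 2.0, 2.9])
print(f"System 1: MT = {a1:.2f} + {b1:.2f} * ID")
print(f"System 2: MT = {a2:.2f} + {b2:.2f} * ID")
# A smaller slope b means less additional movement time per bit of difficulty.

The quantity compared across guidance systems is thus the slope b (in s/bit), rather than raw target-reach time.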
Here, we have demonstrated the utility of Fitts’ law for comparing the performance of different sound guidance systems under the same experimental conditions. While target-reach time, a commonly used metric for comparing guidance systems, is influenced by target size and user-target distance, these experimental parameters vary widely across studies. Therefore, the potential of using Fitts’ law to compare performance across studies with different experimental settings deserves exploration in future research. As a general practice, we encourage authors to report complete regression equations when employing Fitts’ law.
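For instance, a complete report would state the ID formulation used together with the fitted equation and its goodness of fit; with the Shannon formulation and purely illustrative numbers, this could read:

\[
\mathrm{ID} = \log_2\!\left(\frac{D}{W} + 1\right), \qquad
\mathrm{MT} = a + b\,\mathrm{ID}, \quad \text{e.g. } \mathrm{MT} = 0.45 + 0.82\,\mathrm{ID} \ (R^2 = 0.93),
\]

where D is the distance to the target, W its width, a the intercept (in s), and b the slope (in s/bit).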
ACKNOWLEDGEMENTS
This work was supported by the Agence Nationale de la Recherche (ANR-21CE33-0011-01). It was authorized by the ethics committee CER Grenoble Alpes (Avis-2018-06-19-1).
This work was also partially supported by ROBOTEX 2.0 (Grants ROBOTEX ANR-10-EQPX-44-01 and TIRREX ANR-21-ESRE-0015), funded by the French program Investissements d’avenir.
REFERENCES
Barrera Machuca, M. D. and Stuerzlinger, W. (2019). The effect of stereo display deficiencies on virtual hand pointing. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pages 1–14.
Cha, Y. and Myung, R. (2013). Extended Fitts’ law for 3D pointing tasks using 3D target arrangements. International Journal of Industrial Ergonomics, 43(4):350–355.
Charoenchaimonkon, E., Janecek, P., Dailey, M. N., and Suchato, A. (2010). A comparison of audio and tactile displays for non-visual target selection tasks. In 2010 International Conference on User Science and Engineering (i-USEr), pages 238–243. IEEE.
Clark, L. D., Bhagat, A. B., and Riggs, S. L. (2020). Extending Fitts’ law in three-dimensional virtual environments with current low-cost virtual reality technology. International Journal of Human-Computer Studies, 139:102413.
Drewes, H. (2010). Only one Fitts’ law formula please! In CHI ’10 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’10, pages 2813–2822, New York, NY, USA. Association for Computing Machinery.
Drewes, H. (2023). The Fitts’ law filter bubble. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, CHI EA ’23, New York, NY, USA. Association for Computing Machinery.
Faul, F., Erdfelder, E., Lang, A.-G., and Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2):175–191.
Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6):381.
Fitts, P. M. and Radford, B. K. (1966). Information capacity of discrete motor responses under different cognitive sets. Journal of Experimental Psychology, 71(4):475.
Fons, C., Huet, S., Pellerin, D., Gerber, S., and Graff, C. (2023). Moving towards and reaching a 3-D target by embodied guidance: Parsimonious vs explicit sound metaphors. In International Conference on Human-Computer Interaction, pages 229–243. Springer.
Gao, Z., Wang, H., Feng, G., and Lv, H. (2022). Exploring sonification mapping strategies for spatial auditory guidance in immersive virtual environments. ACM Transactions on Applied Perception (TAP).
Guezou-Philippe, A., Huet, S., Pellerin, D., and Graff, C. (2018). Prototyping and evaluating sensory substitution devices by spatial immersion in virtual environments. In VISIGRAPP 2018 - 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications. SCITEPRESS - Science and Technology Publications.
Hild, M. and Cheng, F. (2014). Grasping guidance for visually impaired persons based on computed visual-auditory feedback. In 2014 International Conference on Computer Vision Theory and Applications (VISAPP), volume 3, pages 75–82. IEEE.
Hoffmann, E. R. (2013). Which version/variation of Fitts’ law? A critique of information-theory models. Journal of Motor Behavior, 45(3):205–215.
Hu, X., Song, A., Wei, Z., and Zeng, H. (2022). StereoPilot: A wearable target location system for blind and visually impaired using spatial audio rendering. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 30:1621–1630.
Katz, B. F., Kammoun, S., Parseihian, G., Gutierrez, O., Brilhault, A., Auvray, M., Truillet, P., Denis, M., Thorpe, S., and Jouffrais, C. (2012). NAVIG: Augmented reality guidance system for the visually impaired. Virtual Reality, 16(4):253–269.