eral combinations of output modalities to notify the driver about the road situation ahead. The analysis of the qualitative and quantitative data shows that speech messages were effective in conveying warning information to drivers. We also found that visual warnings are generally considered distracting and cause a higher workload; still, participants see them as a good backup to voice warnings. Voice interaction with a car appears to be a novelty for elderly drivers. Within a formative approach, we recommend recruiting new participants for each experiment while keeping the sample size between 6 and 8 individuals, in order to control for the learning effect associated with the task and to uncover new design opportunities. Regarding standardized questionnaires, we found the TLX and the UEQ suitable for identifying potential areas of improvement in our domain. We believe these findings should be useful for practitioners and researchers involved in the design and development of features for semi-autonomous vehicles, such as voice-based interfaces, chat-bots, or road sign assistance.
ACKNOWLEDGEMENTS
The authors acknowledge the support of the VIADUCT project (reference 7982), funded by the Service Public de Wallonie (SPW), Belgium.
REFERENCES
Baldwin, C. L. and Lewis, B. A. (2014). Perceived urgency
mapping across modalities within a driving context.
Applied Ergonomics, 45(5):1270–1277.
Boelhouwer, A., van Dijk, J., and Martens, M. H. (2019).
Turmoil Behind the Automated Wheel. In HCI in
Mobility, Transport, and Automotive Systems, volume
11596, pages 3–25, Cham. Springer.
Cao, Y., Castronovo, S., Mahr, A., and Müller, C. (2009).
On timing and modality choice with local danger
warnings for drivers. In Proceedings of the 1st In-
ternational Conference on Automotive User Inter-
faces and Interactive Vehicular Applications, Auto-
motiveUI ’09, page 75–78, New York, NY, USA. As-
sociation for Computing Machinery.
Debernard, S., Chauvin, C., Pokam, R., and Langlois, S.
(2016). Designing human-machine interface for au-
tonomous vehicles. IFAC-PapersOnLine, 49(19):609–614. 13th IFAC Symposium on Analysis, Design, and Evaluation of Human-Machine Systems HMS 2016.
Eurostat (2017). A look at the lives of the elderly in the EU today.
Frison, A.-K., Wintersberger, P., Liu, T., and Riener, A.
(2019). Why do you like to drive automated? A
context-dependent analysis of highly automated driv-
ing to elaborate requirements for intelligent user inter-
faces. In Proceedings of the 24th International Con-
ference on Intelligent User Interfaces, IUI ’19, page
528–537, New York, NY, USA. Association for Com-
puting Machinery.
Gerber, M. A., Schroeter, R., and Vehns, J. (2019). A video-
based automated driving simulator for automotive UI prototyping, UX and behaviour research. In Proceed-
ings of the 11th International Conference on Automo-
tive User Interfaces and Interactive Vehicular Appli-
cations, AutomotiveUI ’19, page 14–23, New York,
NY, USA. Association for Computing Machinery.
Hart, S. G. (2006). NASA-Task Load Index (NASA-TLX); 20
years later. Proceedings of the Human Factors and
Ergonomics Society Annual Meeting, 50(9):904–908.
Huang, G. and Pitts, B. (2020). Age-related differences
in takeover request modality preferences and atten-
tion allocation during semi-autonomous driving. In
Gao, Q. and Zhou, J., editors, Human Aspects of IT for
the Aged Population. Technologies, Design and User
Experience, pages 135–146, Cham. Springer Interna-
tional Publishing.
Insurance Institute for Highway Safety (IIHS) (2020). Ad-
vanced driver assistance.
Kieffer, S. (2017). Ecoval: Ecological validity of cues
and representative design in user experience evalua-
tions. AIS Transactions on Human-Computer Interac-
tion, 9(2):149–172.
Koo, J., Kwac, J., Ju, W., Steinert, M., Leifer, L., and
Nass, C. (2015). Why did my car just do that? Ex-
plaining semi-autonomous driving actions to improve
driver understanding, trust, and performance. Interna-
tional Journal on Interactive Design and Manufactur-
ing, 9(4):269–275.
Krome, S., Holopainen, J., and Greuter, S. (2017). Au-
toplay: Unfolding motivational affordances of au-
tonomous driving. In Automotive User Interfaces,
pages 483–510. Springer.
Kutchek, K. and Jeon, M. (2019). Takeover and handover
requests using non-speech auditory displays in semi-
automated vehicles. In Extended Abstracts of the 2019
CHI Conference on Human Factors in Computing Sys-
tems, CHI EA ’19, New York, NY, USA. Association
for Computing Machinery.
Laugwitz, B., Held, T., and Schrepp, M. (2008). Construc-
tion and Evaluation of a User Experience Question-
naire. HCI and Usability for Education and Work,
5298:63–76.
Luoma, J. and Rämä, P. (2001). Comprehension of pic-
tograms for variable message signs. Traffic Engineer-
ing & Control, 42(2):53–58.
Martelaro, N. and Ju, W. (2017). WoZ Way: Enabling Real-
Time Remote Interaction Prototyping & Observation
in On-Road Vehicles. In Proceedings of the
20th ACM Conference on Computer-Supported Coop-
erative Work & Social Computing, pages 169–182.
Nees, M. A., Helbein, B., and Porter, A. (2016). Speech
Auditory Alerts Promote Memory for Alerted Events