Make Automation G.R.E.A.T (Again)
Julien Donnot¹, Daniel Hauret², Vincent Tardan¹ and Jérôme Ranc¹
¹French Air Warfare Center, 1061 Avenue du Colonel Rozanoff, Mont-de-Marsan, France
²Thales AVS, 75-77 Avenue Marcel Dassault, Mérignac, France
Keywords: Military, Aviation, Artificial Intelligence, Ergonomics, Autonomy.
Abstract: The integration of Artificial Intelligence Based Systems (AIBS) into current and future fighter aircraft has begun. More
than automation, these systems provide autonomy that will certainly change fighter pilots' cognitive
activity. The current study aimed to test a methodology conceived for the production of ergonomic guidelines.
We collected data on fighter pilots' use of a path solver by assessing their trust calibration in the AIBS.
Analysis of undertrust and overtrust situations led us to formulate ergonomic principles relevant not only
to the tested AIBS but also to other AIBS.
1 INTRODUCTION
The objective of the current work was to test a method
designed for the evaluation of human trust behaviors
toward Artificial Intelligence Based Systems (AIBS). The
French Air Force needs to develop the capability to
evaluate the usability of AIBS employed on board a
fighter aircraft. Trust in autonomy can be considered
a key factor for usability. As the French Air Force has
to prepare for forthcoming conflicts, the design
of future air combat systems requires anticipating
ergonomic issues, especially in terms of decision-
making, workload and errors in fighter aircraft
cockpits. In the field of prospective ergonomics
(Robert & Brangier, 2009), Brangier and Robert
(2014) point out the difficulty of representing future
activity related to a system that does not yet exist.
Considering that the main characteristic of future
fighter aircraft will be the obligation for pilots to
collaborate with an AIBS (Lyons, Sycara, Lewis &
Capiola, 2021), there is a real need to envision this
future collaborative activity.
Lyons et al. (2018) warned about the specificity
of trust in future autonomy, including AIBS, in the
field of military aviation. Conducting studies with real
operators, real tools and real consequences (the R3
concept) appears to be the most relevant approach. In the field of
military aviation, real operators are fighter pilots, real
tools are fighter aircraft (Rafale) and real
consequences arise in a tactical environment. Few
studies have reported knowledge about the French
fighter pilot's activity. Amalberti (1996) touched on
some specific features of this activity, Guérin,
Chauvin, Leroy, and Coppin (2013) adapted a
Hierarchical Task Analysis method to one air operation
and Hauret (2010) was the first to investigate pilots'
collaboration with an artificial agent.
To define what the collaborative activity
in a future fighter cockpit would be, ergonomists need to ensure
the usability of human-machine interfaces. Bastien and
Scapin (1995) described a set of criteria designed to
guide design. These guidelines were conceived
for human-computer interfaces. Given that the
functions performed by AIBS are and will be more
complex and sometimes innovative, dedicated AIBS design
guidelines deserve consideration. In the current
study, the authors focused on trust as a critical factor for
the usability of AIBS on board a fighter aircraft. Accordingly,
the objective of the study was to produce design
guidelines that increase usability by building pilots'
trust in AIBS.
Trust is a complex concept depending on
individual, organizational and cultural contexts (Lee &
See, 2004), but in the current study we chose to focus on its calibration
by considering both the lack and the
excess of trust in a specific AIBS. A large number of
methods and metrics can drive the analysis of trust levels
(Hoff & Bashir, 2015). To assess usability in
relation to trust, pilots' behaviors prevailed over
pilots' feelings. Therefore, we developed a method
immersing operational fighter pilots in a simulated
air combat mission with an operating AIBS.
The experimental objectives were 1) to identify the causes of
observed trust levels in order to understand pilots' uses