hedonic (e.g. Perceived Enjoyment) or social aspects
(e.g. Subjective Norm), etc.
However, the acceptability of technologies based
on forms of Artificial Intelligence (AI), such as AVs,
seems to run up against the issue of trust
(Wintersberger et al., 2019). Trust is not directly
addressed in acceptability reference models such as
the TAM (Bastien & Scapin, 1993; Politis et al., 2019),
UTAUT (Venkatesh et al., 2012) or Nielsen's (1993)
model. Nevertheless, the link between trust and
acceptability has been studied for a long time, and the
emergence of AI has led to the enrichment of these
models (Hegner, Beldad, & Brunswick, 2019). Some
determinants of trust are close to those of
acceptability, starting with attitude, which is
associated with both acceptability (Davis &
Venkatesh, 1996) and trust (Politis et al., 2018;
Wintersberger et al., 2019).
4 A POINT OF VIEW ON TRUST
ATTRIBUTION
As with acceptability, trust in a system is
determined by different factors. There are many
definitions of trust in a person (Rajaonah, 2006). A
fairly general description is to associate trust with
"expectations, assumptions or beliefs about the
likelihood that another person's future actions will be
beneficial, favourable or at least not detrimental to his
or her interests" (Robinson, 1996). This prognosis is
based on cues relating to attributes such as
competence (Degenne, 2009; Karsenty, 2015) or
reliability (Payre, Cestac & Delhomme, 2014).
These attributes can be found in the questionnaire
proposed by Jian, Bisantz and Drury (2000) to
measure trust in a system. In their model, they
differentiated distrust factors (e.g. misleading, lack
of transparency) from trust factors (e.g. reliable,
understandable). The determinants of trust in a
system (Hegner et al., 2019; Rajaonah, 2006) are
close to those of trust in a third party, especially
reliability, which is fundamental for AVs (Payre et
al., 2014). Reliability can be assessed over the long
term, which introduces a notion of familiarity that is
favourable to trust (Rajaonah, 2006). In this case it is
possible to assign a level of trust to a target (a person,
a group of persons, an object or a type of object)
based on observations accumulated over repeated
experiences with it. However, some situations lack
the support of recurrent experience. This is the case,
for example, aboard a taxi in a foreign country: the
passenger must quickly obtain cues about the
driver's ability to deliver the desired result. To do so,
it is possible to draw on action schemata constructed
from previous road experience, which allow the user
to check whether the observed actions are compliant.
If they are, trust can be established. These schemata
are building blocks of mental representation, but also
of situation awareness.
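To make this mechanism concrete, the sketch below (in Python) illustrates how observed driving actions could be checked against action schemata built from prior road experience, with trust rising only when observations are compliant. The schema attributes, thresholds and update steps are entirely hypothetical and serve only to illustrate the reasoning, not to describe an actual implementation.

from dataclasses import dataclass

@dataclass
class ActionSchema:
    """Expected behaviour learned from prior road experience (hypothetical values)."""
    name: str
    max_speed_kmh: float      # speed considered acceptable in this context
    min_headway_s: float      # minimum time gap to the vehicle ahead

def is_compliant(schema: ActionSchema, speed_kmh: float, headway_s: float) -> bool:
    """Check whether an observed driving action matches the schema."""
    return speed_kmh <= schema.max_speed_kmh and headway_s >= schema.min_headway_s

def update_trust(trust: float, compliant: bool, step: float = 0.05) -> float:
    """Raise trust after a compliant observation, lower it otherwise (bounded 0..1)."""
    trust += step if compliant else -2 * step   # violations weigh more than confirmations
    return max(0.0, min(1.0, trust))

# Example: a passenger with no prior experience of this particular driver or AV.
urban_schema = ActionSchema("urban driving", max_speed_kmh=50.0, min_headway_s=2.0)
trust = 0.5   # neutral starting point
for speed, headway in [(45.0, 2.5), (48.0, 2.2), (62.0, 1.1)]:
    trust = update_trust(trust, is_compliant(urban_schema, speed, headway))
print(round(trust, 2))   # trust rises with the first two observations, drops after the violation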
5 PRINCIPLES OF AN
EMPATHETIC AND
COGNITIVE INTERFACE
The activities that can be carried out on board an AV
will draw the user's attention away from the road.
Cognitive support for the user covers two important
aspects of the driving situation: the road situation and
the autonomous driving. The road situation
corresponds to the flow of information available in
the environment that allows the user to understand
the vehicle's behaviour (e.g. traffic, pedestrian
presence, signage, weather, etc.). Autonomous
driving refers to the driving actions derived from the
information the AV picks up from the environment.
The processing carried out by the AV is barely visible
to the user, given its speed and complexity. It is,
however, possible to make certain "goals" visible
(e.g. increasing speed, anticipating a traffic jam) and
to share some of the environmental information
processed by the AV. This information helps
passengers understand how the AV works and also
supports their representation of the situation. It can
be communicated symbolically or verbally through
different sensory channels: visual, auditory, haptic,
etc.
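As an illustration of this communication principle, the following Python sketch shows how a visible AV "goal" and a piece of shared environmental information could be encoded as messages routed to different sensory channels. The message fields, channel names and rendering forms are our own assumptions, not elements of any existing AV interface.

from dataclasses import dataclass
from enum import Enum

class Channel(Enum):
    VISUAL = "visual"
    AUDIO = "audio"
    HAPTIC = "haptic"

class Form(Enum):
    SYMBOLIC = "symbolic"   # icon, pictogram, light pattern
    VERBAL = "verbal"       # spoken or written sentence

@dataclass
class InterfaceMessage:
    content: str            # what is communicated to the passenger
    form: Form              # symbolic or verbal rendering
    channel: Channel        # sensory channel used to deliver it

# A visible AV "goal" and a piece of shared environmental information.
messages = [
    InterfaceMessage("Anticipating a traffic jam: reducing speed", Form.VERBAL, Channel.AUDIO),
    InterfaceMessage("Pedestrian detected on the right", Form.SYMBOLIC, Channel.VISUAL),
]

for m in messages:
    print(f"[{m.channel.value}/{m.form.value}] {m.content}")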
A great deal of information is available about the
AV and the road situation, so the design must respect
a certain minimalism (Bastien & Scapin, 1993;
Maeda, 2006) to avoid cognitive overload. The choice
of information and sensory channels is therefore an
important issue: the interface should not compete too
strongly with the passenger's activities, at the risk of
calling the very benefit of the AV into question. This
is where the empathetic nature of the interface comes
into play. This empathy is ideally bi-directional, as in
a communication situation: the user needs to
understand how the AV works, and the AV needs to
"understand" the user's state in order to adjust its
level of information. Following the iterative design
principle, the first version of the interface provides a
standard level of information. This level will be
optimised in a second phase, based on user feedback
and on future measurement of the passenger's
cognitive and emotional states (see Figure 1:
Empathic Module).
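The Empathic Module of Figure 1 is not specified in detail here; as a purely illustrative Python sketch, the logic below assumes the module receives rough estimates of the passenger's cognitive load and negative emotion (both on a 0..1 scale, with hypothetical thresholds) and uses them to pick one of three information levels, falling back to the standard level of the first interface version when no measurement is available.

from enum import Enum
from typing import Optional

class InfoLevel(Enum):
    MINIMAL = "minimal"     # only safety-critical goals
    STANDARD = "standard"   # default level of the first interface version
    DETAILED = "detailed"   # goals plus shared environmental information

def select_info_level(cognitive_load: Optional[float] = None,
                      negative_emotion: Optional[float] = None) -> InfoLevel:
    """Pick an information level from estimated passenger states (hypothetical thresholds)."""
    if cognitive_load is None or negative_emotion is None:
        return InfoLevel.STANDARD          # first iteration: no empathic measurement yet
    if cognitive_load > 0.7:
        return InfoLevel.MINIMAL           # avoid overloading a busy passenger
    if negative_emotion > 0.6:
        return InfoLevel.DETAILED          # an anxious passenger may want more explanation
    return InfoLevel.STANDARD

print(select_info_level())              # STANDARD: no measurement available
print(select_info_level(0.9, 0.2))      # MINIMAL: passenger engaged in another activity
print(select_info_level(0.3, 0.8))      # DETAILED: passenger appears uneasy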