argue that p, or that some action φ should be done¹, in order that the other agent will come to agreement.
The collective goal is to resolve whether p is true or φ
should be done. With regard to individual goals, per-
suasion is asymmetric: the persuader wishes to con-
vince the persuadee, whereas the persuadee wishes to
explore the possibility that its current opinion should
be revised in the light of information known to the
persuader: the persuadee is interested in what is true,
whether it be p or ¬p. A different case of persuasion
is what Walton terms a dispute. In this case the per-
suadee also wishes to convince the other agent that its
own original position is correct, so that its individual
goal is now that the other should believe ¬p or that
φ should not be done: we will not consider disputes
further in this paper. Deliberation is generally held
to concern actions: initially both agents are unsure
whether or not to φ, and individually and collectively
they wish to come to agreement as to whether or not
to φ. In the next section we will explore the distinc-
tions further, with a view to precisely characterising
persuasion dialogues in particular.
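As a purely illustrative summary of the goal structures just described, consider the following sketch (in Python, with names entirely of our own choosing; "proponent" and "respondent" here stand for the persuader and persuadee in the persuasion case):

# Illustrative only: the collective and individual goals of the three dialogue
# types discussed above. None of these names appear in the paper.
from dataclasses import dataclass

@dataclass
class DialogueType:
    name: str
    collective_goal: str
    proponent_goal: str
    respondent_goal: str

PERSUASION = DialogueType(
    name="persuasion",
    collective_goal="resolve whether p is true (or whether phi should be done)",
    proponent_goal="convince the respondent that p",
    respondent_goal="discover whether p or not-p, revising its opinion if warranted",
)

DISPUTE = DialogueType(   # a persuasion variant, not considered further in the paper
    name="dispute",
    collective_goal="resolve whether p is true",
    proponent_goal="convince the respondent that p",
    respondent_goal="convince the proponent that not-p",
)

DELIBERATION = DialogueType(
    name="deliberation",
    collective_goal="agree whether or not to do phi",
    proponent_goal="reach agreement on whether to do phi",   # neither agent is
    respondent_goal="reach agreement on whether to do phi",  # initially committed
)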
2 DISTINGUISHING THE DIALOGUE TYPES
Instead of distinguishing between actions and propo-
sitions, we believe that the correct distinction is re-
lated to directions of fit, a distinction made by Searle
(Searle, 2003). Searle distinguishes theoretical rea-
soning, reasoning about what is the case, from prac-
tical reasoning, reasoning about what it is desired to
be the case, and what should be done to realise those
desires. In the first case it is necessary to fit one’s be-
liefs to the world, whereas in the second the idea is to
make the world fit one’s desires, in so far as one has
the capacity to do so. In these terms, inquiry repre-
sents an attempt to better fit the beliefs of the agents
to the world, and deliberation how best to make the
world fit the collective desires of the agents. Persua-
sion can be about either. Note, however, that when
we have two (or more) participating agents, we have
two (or more), probably different, sets of desires to
consider. In deliberation no set of desires should be
given pre-eminence, but rather the group as a whole
needs to come to an agreement on what desires they
will adopt collectively. In contrast, as discussed in
(Bench-Capon, 2002), in persuasion it is the desires of the persuadee that matter: a persuadee is fully entitled to use its own preferences to assess any proposition or proposal, without any need to consider what the persuader desires. The construction of a set of collective desires introduces an additional order of complication, and puts deliberation beyond the scope of this paper. Therefore in what follows we will focus exclusively on persuasion.

¹ There has been some disagreement as to whether persuasion can be over actions. Walton in (Walton, 1998) seems to suggest not. None the less it is clear that we are, in ordinary language, fully prepared to speak of persuading someone to do something.
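Before turning to definitions, the contrast just drawn can be summarised schematically. The sketch below is purely illustrative: the names, the string labels and the choice of Python are ours, and form no part of the machinery introduced in the next subsection.

# Purely illustrative: the two directions of fit and whose preferences settle a
# practical question. All names and labels below are ours, not the paper's.
from enum import Enum, auto

class DirectionOfFit(Enum):
    WORLD_TO_BELIEFS = auto()   # theoretical reasoning: fit beliefs to the world
    DESIRES_TO_WORLD = auto()   # practical reasoning: make the world fit desires

# Inquiry and deliberation are each tied to one direction of fit; persuasion
# can be about either.
FIT_OF_DIALOGUE = {
    "inquiry": {DirectionOfFit.WORLD_TO_BELIEFS},
    "deliberation": {DirectionOfFit.DESIRES_TO_WORLD},
    "persuasion": {DirectionOfFit.WORLD_TO_BELIEFS, DirectionOfFit.DESIRES_TO_WORLD},
}

def evaluation_basis(dialogue_type, persuadee_preferences):
    """Return the preference information used to settle a practical question.

    In persuasion only the persuadee's own preferences are consulted; in
    deliberation a collective set of desires would first have to be agreed,
    which is what places deliberation beyond the scope of the paper.
    """
    if dialogue_type == "persuasion":
        return persuadee_preferences
    if dialogue_type == "deliberation":
        raise NotImplementedError("requires constructing a collective set of desires")
    raise ValueError("unknown dialogue type: " + dialogue_type)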
2.1 Definitions
An example, illustrating the use of our notation, is given in Section 3. The reader might find it helpful to refer to it in conjunction with the following definitions, for concrete illustrations of their use.
The knowledge bases of agents can be partitioned into factual elements, on which agents should agree², used when the direction of fit is from world to beliefs,
and preference elements, which represent their own
individual desires, tastes and aspirations, and are used
when the direction of fit is from desires to the world.
Thus, the preference elements represent the way the
agent evaluates possible futures to determine what it
wishes to bring about, and how it evaluates objects
and situations for value judgements such as best car
and acceptable restaurant.
Definition 1. Let AG denote a set of agents, each of which, Ag ∈ AG, has a knowledge base KB^Ag. KB^Ag is partitioned into factual elements denoted by KB^Ag_F and preference information denoted by KB^Ag_P. KB^Ag_F comprises facts, strict rules and defeasible rules. KB^Ag_P comprises rules to determine the utility for Ag of certain items based on their attributes, and the weights used by these rules. These preference elements are defined below.
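As a purely illustrative reading of Definition 1, the partition might be represented as follows; the class and field names are ours, and the internal form of rules, utilities and weights is left abstract, since the preference elements are only specified in the definitions that follow.

# Purely illustrative sketch of the partition in Definition 1; class and field
# names are ours, and rules, utilities and weights are left as opaque strings.
from dataclasses import dataclass, field

@dataclass
class FactualKB:                 # KB^Ag_F: fitted to the world (objective)
    facts: set[str] = field(default_factory=set)
    strict_rules: list[str] = field(default_factory=list)
    defeasible_rules: list[str] = field(default_factory=list)

@dataclass
class PreferenceKB:              # KB^Ag_P: the agent's own desires, tastes and aspirations
    utility_rules: list[str] = field(default_factory=list)   # utility of items from their attributes
    weights: dict[str, float] = field(default_factory=dict)  # weights used by those rules

@dataclass
class Agent:                     # Ag in AG
    name: str
    kb_f: FactualKB = field(default_factory=FactualKB)
    kb_p: PreferenceKB = field(default_factory=PreferenceKB)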
Agents expand their KB_F by taking information from one another, but KB_P remains fixed throughout the dialogue. Whereas, because it is intended to fit the world, KB_F is objective, KB_P represents the personal preferences of an individual agent, and is entirely local to the agent concerned. We will use f for factual propositions, and p_Ag (to be read as "p is the case for Ag") for propositions based on preferences. We will not represent actions separately, so that p_Ag may represent either propositions such as Roux Brothers is an
² Of course, this does assume that there is a set of propositions which are objectively true. In practice there may be room for dispute. None the less we will assume agreement in judgements (even for matters such as whether a restaurant is near, or whether it is of good quality), reserving subjectivity for differences in taste, based on preferences and choice.