front of the TV, holding his smartphone. The living room is
equipped with sensors that capture sound/noise, time,
temperature, the status of the window (open/closed) and of the
radio and TV (on/off), and the user's current activity, and
with effectors that operate the windows, radio and TV and
trigger the execution of digital services, which may be
displayed on communication devices such as the TV.
The smartphone allows Jim to remotely control the TV.
Moreover, the house is equipped with several interaction
devices through which the SHE communicates with the user,
each implementing a different interaction metaphor. Examples
of such devices, used to control the house appliances, are the
touch screens on the fridge or on mirrors, the smartphone that
the user usually carries, and a socially intelligent robot that
is able to move around the home and to engage in natural-language
dialogue with its inhabitants.
3.1 Sensor Agents
Sensor Agents (SAs) are in charge of controlling a set of
sensors suitably placed in the environment to provide
information about context parameters and features (e.g.
temperature, light level, humidity), such as meters that
sense physical and chemical parameters, microphones and
cameras that capture what happens, and status indicators
for various kinds of electric and/or mechanical devices.
The values gathered by the physical sensors are sent in
real time to the reasoning behavior of the associated SA,
which uses abstraction to strip away details that are
known but irrelevant to the specific current tasks and
objectives. For instance, the SA providing information
about temperature will abstract the centigrade value into
a higher-level representation such as “warm”, “cold”, and
so on. This abstraction process may take into account the
observed user's specific needs and preferences (e.g. the
same temperature might be cold for one user but acceptable
for another). For instance, let cold(X,Y) denote the fact
that user Y is cold in a given situation X. This fact can
be derived from the measured temperature using a rule of
the form:
cold(X,Y) :- temperature(X,T), T<18, user(Y), present(X,Y),
jim(Y).
(it is cold for user Jim if he is present in a situation in
which the temperature is lower than 18 degrees). In turn,
such a rule can be provided directly by an expert (or by
the user himself), or it can be learned (and possibly later
refined) from observation of user interaction (Ferilli et
al., 2005).
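The user-specific abstraction step described above can be sketched as follows. This is an illustrative Python rendering, not the system's actual code: the function names, the default thresholds, and the per-user threshold table are all assumptions made for the example, with Jim's 18-degree threshold taken from the cold(X,Y) rule above.

```python
# Hypothetical sketch of a Sensor Agent's abstraction step: a raw
# centigrade reading is mapped to a qualitative label, with per-user
# thresholds so the same reading can be "cold" for one user and
# "acceptable" for another. All names and values are illustrative.

def abstract_temperature(celsius, cold_threshold=18.0, warm_threshold=26.0):
    """Map a raw temperature reading to a qualitative label."""
    if celsius < cold_threshold:
        return "cold"
    if celsius > warm_threshold:
        return "warm"
    return "acceptable"

# Per-user preferences (hypothetical): Jim's threshold matches the
# T < 18 condition in the cold(X,Y) rule; "ann" tolerates more cold.
user_cold_threshold = {"jim": 18.0, "ann": 15.0}

def feels(user, celsius):
    """Abstract a reading according to a specific user's preferences."""
    return abstract_temperature(celsius,
                                cold_threshold=user_cold_threshold[user])

print(feels("jim", 14))  # cold
print(feels("ann", 16))  # acceptable
```

A learned rule such as the one above would, in this sketch, correspond to adjusting a user's entry in the threshold table from observed interactions.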
3.2 The Butler Agent
The Butler Agent recognizes user goals starting from
percepts received by SAs and composes a smart
service corresponding to a workflow that integrates
elementary services according to the particular
situation.
The reasoning of this agent mainly involves deduction, to
make explicit information that is hidden in the data, and
abduction, to proceed sensibly even in situations in which
part of the data is missing or otherwise unknown. In some
cases it may also use abstraction, performed at a higher
level than in the SAs.
Each observation of a specific situation can be formalized
as a conjunctive logic formula under the Closed World
Assumption (what is not explicitly stated is assumed to be
false), describing a snapshot of the environment at a given
time. A model, in turn, consists of a set of Horn clauses
whose heads describe the target concepts and whose bodies
describe the preconditions for those targets to be
detected. For instance, the following model might be
available:
improveHealth(X) :- present(X,Y), user(Y), has_fever(Y).
improveHealth(X) :- present(X,Y), user(Y), has_headache(Y), cold(X,Y).
improveHealth(X) :- present(X,Y), user(Y), has_flu(Y).
improveMind(X) :- present(X,Y), user(Y), sad(Y).
improveMind(X) :- present(X,Y), user(Y), bored(Y).
On the other hand, a sample observation might be:
morning(t0), closedWindow(t0), present(t0,j), jim(j), user(j),
temperature(t0,14), has_flu(j), bored(j).
Reasoning infers that Jim is cold: cold(t0,j). Since all the
preconditions of the third and fifth rules in the model are
satisfied by this situation for X = t0 and Y = j, the user
goals improveHealth and improveMind are recognized for Jim
at time t0, which may trigger the activation of suitable
workflows aimed at attaining those results.
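The goal-recognition step can be illustrated with a deliberately small sketch. This is not the paper's actual inference engine: facts are represented as ground tuples, each Horn clause body is written out for fixed bindings X and Y, and the Closed World Assumption appears as simple set membership (any fact not listed is false). cold(t0,j) is included as already derived by the Sensor Agent.

```python
# Toy goal recognition over the sample observation (illustrative only).
# A rule fires for situation x and user y when every body literal is
# among the observed facts; absent facts are taken as false (CWA).

observation = {
    ("morning", "t0"), ("closedWindow", "t0"), ("present", "t0", "j"),
    ("jim", "j"), ("user", "j"), ("has_flu", "j"), ("bored", "j"),
    ("cold", "t0", "j"),  # derived by the SA from temperature(t0,14)
}

# Each rule: goal name plus a function producing its body literals
# for given bindings of X (situation) and Y (user).
rules = [
    ("improveHealth", lambda x, y: [("present", x, y), ("user", y),
                                    ("has_fever", y)]),
    ("improveHealth", lambda x, y: [("present", x, y), ("user", y),
                                    ("has_headache", y), ("cold", x, y)]),
    ("improveHealth", lambda x, y: [("present", x, y), ("user", y),
                                    ("has_flu", y)]),
    ("improveMind",   lambda x, y: [("present", x, y), ("user", y),
                                    ("sad", y)]),
    ("improveMind",   lambda x, y: [("present", x, y), ("user", y),
                                    ("bored", y)]),
]

def recognized_goals(facts, x, y):
    """Return the goals whose preconditions all hold in the facts."""
    return sorted({goal for goal, body in rules
                   if all(lit in facts for lit in body(x, y))})

print(recognized_goals(observation, "t0", "j"))
# ['improveHealth', 'improveMind']
```

Only the has_flu and bored clauses fire here; the has_fever, has_headache, and sad clauses fail because their body literals are absent from the observation.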
The BA reasons not only on goals but also on workflows.
Once a goal is triggered, it selects the appropriate
workflow by performing semantic matchmaking between the
semantic IOPE (Input/Output/Precondition/Effect) description
of the user's high-level goal and the semantic profiles of
all the workflows available in the system's knowledge base
(W3C, 2004). This process yields from zero to n workflows
that are semantically consistent with the goal, ranked by
semantic similarity to it.
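The shape of this ranking step can be conveyed with a heavily simplified sketch. A real matchmaker compares OWL-S IOPE descriptions using an ontology reasoner; here, purely for illustration, each goal and workflow profile is reduced to a flat set of concept labels and similarity to Jaccard overlap. The workflow names, concept labels, and threshold are all invented for the example.

```python
# Toy semantic matchmaking (illustrative): rank workflow profiles by
# set overlap with the goal's concepts, discarding weak matches.

def jaccard(a, b):
    """Overlap score between two concept sets, in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical workflow profiles from the system's knowledge base.
workflow_profiles = {
    "medicineReminder": {"health", "medication", "notification"},
    "callDoctor":       {"health", "communication"},
    "playMusic":        {"entertainment", "mood"},
}

def rank_workflows(goal_concepts, profiles, threshold=0.2):
    """Return workflow names consistent with the goal, best match first."""
    scored = [(jaccard(goal_concepts, prof), name)
              for name, prof in profiles.items()]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= threshold]

print(rank_workflows({"health", "medication"}, workflow_profiles))
# ['medicineReminder', 'callDoctor']
```

The empty result case (zero workflows) falls out naturally when no profile clears the threshold, matching the "from zero to n workflows" behavior described above.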
For instance, as shown in Figure 1, the semantic
matchmaking process, starting from the two previously
recognized high-level goals improveHealth and improveMind,
leads to two different workflows, one associated with each
goal.