3 MATERIAL AND METHODS
The process control system is implemented through a system of distributed agents based on the FIPA specifications (FIPA, 2003), which establish the basic architecture for the development of such agents.
The Foundation for Intelligent Physical Agents
(FIPA) is an international organisation dedicated to
promoting the industry of intelligent agents by
openly developing specifications that support
interoperability among agents and agent-based
systems. The primary focus of the FIPA Abstract Architecture is to create semantically meaningful
message exchanges between agents that may be
using different messaging transports, different Agent
Communication Languages (ACLs), or different
content languages. Agent management provides the
normative framework within which FIPA agents
exist and operate. It establishes the logical reference
model for the creation, registration, location,
communication, migration and retirement of agents.
Each installation provides a component, the Agent Management System, that registers all the agents in the network together with their interfaces. To this end, it uses the Localisation Service, which makes it possible to locate the other agents during initialisation and to detect, by means of triggers, when agents appear or disappear. When a new agent appears it is registered; when an agent disappears it is removed from the Agent Management System. The agents were implemented with FIPA-OS components, and communications are established with a subset of the ACL language.
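As an illustration of this registration behaviour, the following sketch shows how a local registry might react to the triggers fired by the Localisation Service; the names AgentRegistry, onAgentAppeared and onAgentDisappeared are illustrative assumptions, not part of the FIPA-OS API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the registration behaviour described above;
// the names used here are assumptions, not the FIPA-OS API.
public class AgentRegistry {

    // Agent name -> description of its interface (services it offers).
    private final Map<String, String> registeredAgents = new ConcurrentHashMap<>();

    // Trigger fired by the Localisation Service when a new agent appears.
    public void onAgentAppeared(String agentName, String agentInterface) {
        registeredAgents.put(agentName, agentInterface);
    }

    // Trigger fired when an agent disappears (failure or normal shutdown).
    public void onAgentDisappeared(String agentName) {
        registeredAgents.remove(agentName);
    }

    // Lookup used by other agents to locate a peer during initialisation.
    public String locate(String agentName) {
        return registeredAgents.get(agentName);
    }
}
```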
FIPA-OS (FIPA-OS, 2003) is a component-
orientated toolkit for the construction of FIPA
compliant Agents through mandatory components
(i.e. components required by all the executable
FIPA-OS Agents), components with switchable
implementations, and optional components (i.e.
components that a FIPA-OS Agent can optionally
use). FIPA-OS tasks generally encapsulate some functionality that forms a logical unit of work (e.g. searching the DF, conducting a negotiation with another Agent, or waiting for a specified period of time). Tasks can also be composed of sub-tasks, enabling more complex “units of work” to be sub-divided into smaller chunks of functionality. Tasks rely on event-based processing, where events are delivered by the dynamic invocation of various methods.
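As a rough illustration of this task model (not the actual FIPA-OS Task API; all names here are assumptions), a task can be sketched as an object that starts a unit of work, spawns sub-tasks and receives their completion as events:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified illustration of event-driven tasks with sub-tasks;
// this is not the FIPA-OS Task class, only a sketch of the idea.
abstract class SimpleTask {
    private final List<SimpleTask> subTasks = new ArrayList<>();
    private SimpleTask parent;

    // Called once when the task is started: set up work, spawn sub-tasks.
    protected abstract void start();

    // Event delivered when a child task signals completion.
    protected void onSubTaskDone(SimpleTask child) { }

    // Create and start a sub-task that reports back to this task.
    protected void newSubTask(SimpleTask child) {
        child.parent = this;
        subTasks.add(child);
        child.start();
    }

    // A task calls done() when its unit of work is finished;
    // the completion event is propagated to the parent task.
    protected void done() {
        if (parent != null) {
            parent.onSubTaskDone(this);
        }
    }
}
```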
Each agent includes a KBS based on information that is collected either locally (from the computer that executes the agent) or by communicating with the other agents. The rule system integrated into the agents was built with the JESS expert system shell and scripting language (JESS, 2003). Jess can be used in two overlapping ways. Firstly, it can act as a rule engine, a special type of program that very efficiently applies rules to data; a rule engine together with its rules is said to constitute an expert system. One of the newest applications of expert systems is the reasoning component of intelligent agents. In addition, the Jess language is a general-purpose programming language, and it can directly access all Java classes and libraries (Friedman-Hill, 2003).
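The fragment below sketches how such a rule system can be embedded in a Java agent through the Jess library (jess.Rete); the host template and the rule are illustrative examples, not the actual rules used in our agents.

```java
import jess.JessException;
import jess.Rete;

public class AgentRuleEngine {
    public static void main(String[] args) throws JessException {
        Rete engine = new Rete();

        // Knowledge collected locally or received from other agents is
        // represented as facts matching this (illustrative) template.
        engine.executeCommand(
            "(deftemplate host (slot name) (slot free-memory))");

        // Example rule: flag hosts with little free memory.
        engine.executeCommand(
            "(defrule low-memory" +
            "  (host (name ?n) (free-memory ?m&:(< ?m 128)))" +
            "  => (printout t ?n \" is low on memory\" crlf))");

        engine.reset();
        engine.executeCommand("(assert (host (name srv1) (free-memory 64)))");
        engine.run();  // fires low-memory for srv1
    }
}
```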
Jess's rule engine uses an improved form of the
Rete algorithm (Forgy, 1982) to match rules against
the knowledge base. Jess is optimised for speed at
the cost of space. Jess differs from some other Rete-based systems in that it includes both a form of backward chaining and a construct called defquery, which allows the user to make direct queries of the knowledge base.
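As a hypothetical example, an agent could use defquery to ask its knowledge base for all hosts with enough free memory; the query below assumes the host template of the previous sketch and is purely illustrative.

```java
import jess.JessException;
import jess.Rete;

// Illustrative: a direct query of the knowledge base with defquery,
// assuming the host template defined in the previous sketch.
public class HostQuery {
    static void defineQuery(Rete engine) throws JessException {
        engine.executeCommand(
            "(defquery find-hosts" +
            "  (declare (variables ?needed))" +
            "  (host (name ?n) (free-memory ?m&:(>= ?m ?needed))))");
        // The query can then be invoked from Jess code, e.g.
        // (run-query find-hosts 256), to enumerate all matching hosts
        // without writing a dedicated rule.
    }
}
```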
The implementation of the rules associated with task management was based on the characterisation of the tasks (the resources and temporal requirements necessary for their execution) and on the information compiled by the agent network.
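A rule of this kind could, hypothetically, take the following form, matching the memory requirement of a task against the host information gathered by the agent network; the templates and the rule are illustrative assumptions, not the actual rules of the system.

```java
import jess.JessException;
import jess.Rete;

// Hypothetical sketch of a task-allocation rule: the templates and the
// rule are illustrative, not the actual rules used in the system.
public class TaskRules {
    static void load(Rete engine) throws JessException {
        // Task characterisation: identity, priority and required memory (MB).
        engine.executeCommand(
            "(deftemplate task (slot id) (slot priority) (slot memory-needed))");
        // Host information compiled by the agent network.
        engine.executeCommand(
            "(deftemplate host (slot name) (slot free-memory))");
        // A host becomes a candidate when it can satisfy the task's memory need.
        engine.executeCommand(
            "(defrule candidate-host" +
            "  (task (id ?t) (memory-needed ?need))" +
            "  (host (name ?h) (free-memory ?free&:(>= ?free ?need)))" +
            "  => (assert (candidate ?t ?h)))");
    }
}
```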
3.1 Tasks Control
The Tasks Manager includes a priority scheme that is dynamically reconfigurable according to the needs of the medical KBS, the patient state, physician requests, etc. Capabilities such as re-launching a task after the detection of a missing agent or a normal disconnection, launching it in duplicate under extreme priority conditions, and others were implemented for some specific image processing tasks. This is possible because each agent is linked to a rule system that determines all the executable actions.
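The sketch below illustrates two of these capabilities under assumed names (TaskManager, onAgentLost, launch); it is a simplified illustration of the behaviour, not the actual implementation.

```java
// Simplified illustration of two of the capabilities described above;
// class, method and constant names are assumptions, not the real code.
public class TaskManager {
    static final int PRIORITY_LOW = 0, PRIORITY_NORMAL = 1, PRIORITY_HIGH = 2;

    // Re-launch: invoked when the agent executing a task is reported
    // missing (failure) or has disconnected normally.
    void onAgentLost(String taskId) {
        launch(taskId);   // select a new candidate and start the task again
    }

    // Duplicated launch: under extreme priority conditions the task is
    // started on two candidates at once.
    void launchWithPriority(String taskId, int priority) {
        launch(taskId);
        if (priority == PRIORITY_HIGH) {
            launch(taskId);
        }
    }

    void launch(String taskId) {
        // Candidate selection and task start-up are omitted in this sketch.
    }
}
```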
The mechanism for selecting among the various candidates is strongly influenced by the priority of the processes. The processes were divided into three priority levels, from lowest to highest (0: low, 1: normal and 2: high), which, according to the tests, provide sufficient efficiency. We have followed the general rule that a higher-priority process arising at a given moment can cancel a lower-priority process that is already executing. The selection of the computer for a high-priority task considers the total capacity the computer has available, disregarding the current executions unless these are also marked as high priority. This process will be re-launched after a new selection among the remaining candidates. After a first candidate selection, based on the memory (physical and virtual) required to execute the process, we proceed to a successive refinement of the set of candidates,