Any element can be viewed from the technical
perspective, or the organisational perspective, or the
personal perspective and may appear differently
depending on how it is viewed. As indicated by
Linstone et al. (1981), the use of the T perspective to
study the technical elements, the use of the O
perspective to study the organisational elements, and the use of the P perspective to study the individual elements are vital but by no means adequate. Any
perspective may illuminate any element. It is
inconceivable that a technical element can be
understood without use of the T perspective. But the
O and P perspectives may add important insights.
Similarly, appreciation of an organisation requires
an O perspective, but much can be gained by use of
the T and P perspectives. Linstone et al. concluded
that “most importantly, the different perspectives are
mutually supportive, not mutually exclusive”.
The multiple perspective concept has been widely appreciated in the information security community; for example, the McCumber Cube model of information systems security (1991) presents security measures in three layers: technical, policy and practice, and human factors. However, it is rare to see an approach or a real application that has implemented the multiple perspective concept in security risk assessment. Most of the security risk assessment tools reported in the literature are technically focused, with little or no consideration of organisational and personal factors.
3 INTERACTIONS BETWEEN
TECHNICAL APPROACHES,
ORGANISATIONAL ISSUES
AND HUMAN FACTORS
It is not surprising that the main features of the computer network security practices adopted by an organisation are technical. Tangible and intangible assets must be protected by controls that depend on technical approaches. The typical technical approaches include: (a) Identification and authentication. These controls prevent unauthorized personnel from entering the computer system. Security controls include passwords and firewalls. (b) Logical access control. These controls ensure that sensitive information assets and information systems are only accessed by authorized individuals. Security controls include access policy and technical mechanisms such as encryption and access control lists. (c) Audit trails. These controls ensure that users are accountable for their actions and that indications of system instability or security problems are identified and tracked. Security controls include audit events and review of audit trails (a simple sketch of how (b) and (c) work together is given after this paragraph). However, these technical mechanisms do not offer much protection when employees have a right of access but use it for malicious purposes, or make human errors because of 'carelessness' or 'lack of awareness'. There are many indications that technical security measures alone are not very successful. Most analyses suggest that technical solutions are not enough and that 'the human factor' is often the cause of the downfall, because organisational structures and cultures allow human errors or malicious actions to happen.
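As a simple illustration of how controls (b) and (c) above can work together, the following minimal sketch (in Python, with hypothetical user and resource names chosen only for illustration) checks an access control list before releasing a resource and writes every decision, whether granted or denied, to an audit trail that can later be reviewed.

import datetime

# Hypothetical, illustrative access control list mapping each protected
# resource to the set of user identifiers authorised to access it.
ACL = {
    "patient_record_db": {"dr_smith", "nurse_jones"},
}

# Audit trail; in a real system this would be an append-only, protected log.
audit_trail = []

def access_resource(user, resource):
    """Grant or deny access and record the decision in the audit trail."""
    granted = user in ACL.get(resource, set())
    audit_trail.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "granted": granted,
    })
    return granted

# An authorised request succeeds; an unauthorised one is denied but still logged.
print(access_resource("dr_smith", "patient_record_db"))      # True
print(access_resource("unknown_user", "patient_record_db"))  # False

Even such a minimal mechanism shows why the organisational and personal perspectives matter: the access control list only encodes who may act, not why, and the audit trail is of value only if someone in the organisation actually reviews it.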
Many organisations are exploring organisational
policies that are limited to training staff in the
security procedures they are expected to follow.
These appear to focus on what not to do, i.e. how to
avoid creating a security risk. Very little is available
about how to create a security culture in which
employees take positive responsibility for creating a
climate of security and trust.
Many problems can occur if a balance among the three perspectives cannot be achieved. Employees may work around the existing security system to meet their work demands because of the poor usability of the technical system, for example because it is too rigid or inappropriate in an emerging situation, or a poor match to their skills and organisational culture. All these inappropriate interactions may put security in jeopardy. In one particular example, the
staff of an Accident and Emergency Department in a
UK hospital (Collins, 2007) found that putting their
smart card and password into a computer every time they wanted a patient record was taking precious minutes away from treating patients. So
they decided that the leader of the shift would put
his card into a computer at the beginning of the shift
and leave it there for everybody to use. They also
decided to make public what they were doing, in order to challenge what they regarded as a time-consuming and inappropriate way of protecting patient security. The key feature of this example is that the actions of the employees created the potential for security breaches that could be serious for patients and, in turn, for the NHS Trust. However, they did this not out of malice or ignorance; they did it because the constraints of the security procedures were getting in the way of what they regarded as legitimate ways of undertaking their work. In this case, and potentially
in many others, the need is for security policies,
procedures, and technical approaches that are
accepted by employees and are found to be
workable.