necessary. The objective is to reach a relevant and
homogeneous model with a limited number of
classes (between 10 and 15), each covering a limited
number of requirements.
Step 2 – Design of the assessment tool: Once the
structure is set, a questionnaire shall be designed
accordingly.
Pragmatically speaking, for a light tool dedicated to
SMEs, it is not reasonable to have a question for
each requirement. It is thus necessary to define a
hierarchical questionnaire, composed of general
questions, providing an overview of the topic
assessed in each class. These general questions are
completed with more precise ones, assessing each
requirement of the standard, but used only if
necessary. By rewriting and aggregating the
questions, this step should produce a comprehensive
and light questionnaire, later supported by a
software tool.
Step 3 – Experiments: Once a stable version of
the tool is defined, its results shall be validated
through experiments. Two experiments are planned.
They shall provide feedback on the tool and
demonstrate its efficiency compared to the
traditional approach performed during our initial
experiment.
Once these three steps have been performed
sequentially, the process shall be repeated in an
iterative and incremental manner in order to take
advantage of the feedback gathered during the
experiments.
5 MODELLING OF THE ISO/IEC
27001 REQUIREMENTS
As stated previously, the first step of our research
method aimed at simplifying the structure of the
standard. We first defined a set of coarse-grained
categories related to the key topics of the standard
(e.g., documentation management, resource
management) that were elicited
during our first experiment. Following a previous
work (Valdevit et al., 2009), we distributed all
requirements over this set of pragmatic categories
representing major activities of the ISMS. We
proceeded through iterative analysis, refining our
classification. For each requirement that did not fit
into any of our classes, we created a new class or
extended the scope of an existing one. After the last
iteration, each requirement of the standard’s core
was linked to a suitable category. The same process
was performed with the standard’s appendix,
mapping its 133 security measures onto the different
categories.
The final task consisted of merging these two sets
of requirements to remove redundancies between the
core of the standard and its appendix. Indeed, some
security controls of the appendix, like incident
management or security awareness, are also
mentioned as requirements within the core of the
standard. As a result, 4 classes were merged,
addressing security management, human resources
management, monitoring and review.
In the end, the final set of classes (after the
experimentation step described in Section 7) was
reduced to 10 topics.
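As an illustration, such a classification can be represented as a simple mapping from requirements and controls to classes. The sketch below is only a minimal example of this idea; the clause and control identifiers and the class names are hypothetical placeholders, not the actual mapping produced in this work.

```python
# Illustrative sketch: the clause/control identifiers and class names below
# are hypothetical placeholders, not the actual mapping produced in this work.
from collections import defaultdict

# Each core requirement or Annex A control is assigned to exactly one class.
requirement_to_class = {
    "4.3.2 Control of documents": "Documentation management",
    "5.2.1 Provision of resources": "Resource management",
    "A.13.2.1 Responsibilities and procedures": "Incident management",
    "A.8.2.2 Awareness, education and training": "Human resources management",
}

def group_by_class(mapping):
    """Invert the mapping to list the requirements covered by each class."""
    classes = defaultdict(list)
    for requirement, topic in mapping.items():
        classes[topic].append(requirement)
    return dict(classes)

print(group_by_class(requirement_to_class))
```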
6 DESIGN OF THE ASSESSMENT
TOOL
In order to satisfy the second step of our research
method, most of our work consisted of designing
a pragmatic, clear, and hierarchically organised
questionnaire. The objective here was to assess the
coverage level of an organisation for each topic with
as few questions as possible.
As a result, there are only a couple of general
questions for each class. These open, global
questions let the respondent answer freely.
However, to ensure complete coverage of each
requirement, we implemented complementary sub-
questions. These closed questions shall be used only
to assess a precise requirement not covered by the
answers to the open questions. They are thus rarely
asked, but serve as support when additional
information is required.
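A minimal sketch of how such a two-level questionnaire could be represented is given below; the class name, question texts, and field names are illustrative assumptions rather than the actual content of the tool.

```python
# Minimal sketch of a hierarchical questionnaire, assuming a simple
# two-level structure; class names and question texts are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubQuestion:
    requirement: str  # identifier of the requirement it assesses
    text: str         # closed question, asked only when needed

@dataclass
class GeneralQuestion:
    text: str         # open question giving an overview of the topic
    sub_questions: List[SubQuestion] = field(default_factory=list)

@dataclass
class TopicClass:
    name: str
    general_questions: List[GeneralQuestion] = field(default_factory=list)

documentation = TopicClass(
    name="Documentation management",
    general_questions=[
        GeneralQuestion(
            text="How are ISMS documents created, approved and updated?",
            sub_questions=[
                SubQuestion("4.3.2", "Is there a documented procedure to control documents?"),
                SubQuestion("4.3.3", "Are records protected and easily retrievable?"),
            ],
        )
    ],
)
```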
For each question, a 5-level rating is proposed,
inspired by a standard on process assessment:
ISO/IEC 15504 (ISO, 2003). These rating levels
provide a progressive scale to assess the current
practices within the organisation: N/A (the
requirement is intentionally ignored), not covered
(no practice is in place), partially covered (partially
satisfied or in progress), largely covered (done but
not sufficiently documented), or fully covered
(satisfied and sufficiently documented).
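For illustration, the rating scale and a per-class aggregation of answers could be sketched as follows; the numeric weights and the equal-weight averaging rule are assumptions made for the example, not the actual scoring rule of the tool.

```python
# Sketch of the five-level rating scale and a per-class aggregation,
# assuming equal weights per question; the numeric values are illustrative.
from enum import Enum
from typing import Dict, List, Optional

class Rating(Enum):
    NOT_APPLICABLE = None      # requirement intentionally ignored
    NOT_COVERED = 0.0          # no practice in place
    PARTIALLY_COVERED = 0.33   # partially satisfied or in progress
    LARGELY_COVERED = 0.66     # done but not sufficiently documented
    FULLY_COVERED = 1.0        # satisfied and sufficiently documented

def class_coverage(ratings: List[Rating]) -> Optional[float]:
    """Average coverage of a class, ignoring N/A answers."""
    scores = [r.value for r in ratings if r.value is not None]
    return sum(scores) / len(scores) if scores else None

# Example: coverage per class, ready to be summarised in a chart.
assessment: Dict[str, List[Rating]] = {
    "Documentation management": [Rating.LARGELY_COVERED, Rating.PARTIALLY_COVERED],
    "Incident management": [Rating.NOT_COVERED, Rating.NOT_APPLICABLE],
}
print({cls: class_coverage(ratings) for cls, ratings in assessment.items()})
```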
In the end, the tool proposes about forty
questions for assessing the coverage level of the 10
classes defined in Section 5. For better
comprehension and to ease the drawing of
conclusions, the tool summarises the results of an
assessment in two charts: a radar chart summarising
the coverage of the organisation’s practices with
regard to the requirements of the standard, and a bar graph