
possible causes, put them into cause categories, rate
them by user-defined criteria, and select which ones
to pursue. The prototype allows users to categorise
causes so that the recommended solutions are more
likely to address the underlying causes. The specific
process used in this version is described in more
detail in Douglas et al. (2003).
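As an illustration, the cause capture, rating and selection step might be represented along the following lines. This is a minimal Python sketch only; Cause, rate and select_causes are names invented here for illustration and are not the prototype's actual interface, which is described in Douglas et al. (2003).

    # Hypothetical sketch of cause categorisation, rating and selection.
    from dataclasses import dataclass, field

    @dataclass
    class Cause:
        description: str
        category: str        # e.g. "knowledge", "motivation", "environment"
        ratings: dict = field(default_factory=dict)  # criterion name -> score
        selected: bool = False

    def rate(cause, criteria):
        # Average the user-entered scores over the user-defined criteria.
        return sum(cause.ratings.get(c, 0) for c in criteria) / len(criteria)

    def select_causes(causes, criteria, threshold):
        # Mark for pursuit each cause whose mean rating clears the threshold.
        for cause in causes:
            cause.selected = rate(cause, criteria) >= threshold
        return [c for c in causes if c.selected]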
The focus of this system is the actual roles
people perform in an organisation (as opposed to
their position titles) and the goals they are expected
to achieve. These roles and goals are modelled in an
analysis system, with the models providing a
framework for performance metrics. The gaps in
organisational performance evident in these metrics
are used to initiate support systems development
(e.g. software tools, training courses).
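One way to picture the underlying model is sketched below. This is a hedged illustration only: Role, Goal and Metric are names assumed here for exposition, not the system's published schema.

    # Hypothetical sketch: roles hold goals, goals hold metrics, and
    # metric gaps trigger support-system development.
    from dataclasses import dataclass

    @dataclass
    class Metric:
        name: str
        target: float
        actual: float

        @property
        def gap(self):
            return self.target - self.actual

    @dataclass
    class Goal:
        statement: str
        metrics: list

    @dataclass
    class Role:            # an actual role performed, not a position title
        name: str
        goals: list

    def performance_gaps(role):
        # Gaps evident in the metrics are candidates for support systems.
        return [(g.statement, m.name, m.gap)
                for g in role.goals for m in g.metrics if m.gap > 0]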
Rationale management (Moran and Carroll,
1996; Burge and Brown, 2000) is integrated into the
system. Rationale management allows auditing of
decision making when solutions resulting from
analysis fail to make an impact on organisational
performance. In addition to capturing informal
rationale information through the archiving of
online discussions, a rationale diagram can be
automatically generated from the data entered into
the system. Figure 3 illustrates the rationale diagram
generated by the current prototype. For each
performance goal, the gaps in performance relating
to that goal are entered, and those selected for
further analysis are indicated by a tick. For each
selected gap, the potential causes of the gap are
indicated. For each selected cause, the potential
solutions considered are indicated, and a tick shows
the solutions chosen for implementation.
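The generation step can be sketched as a walk over this goal-gap-cause-solution tree. The structure and example data below are assumed for illustration; the prototype's actual data model may differ.

    # Hypothetical sketch of rationale diagram generation: selected
    # items are marked with a tick, mirroring Figure 3.
    def render(node, depth=0):
        tick = " [tick]" if node.get("selected") else ""
        lines = ["  " * depth + node["label"] + tick]
        for child in node.get("children", []):
            lines.extend(render(child, depth + 1))
        return lines

    diagram = {
        "label": "Goal: reduce average call-handling time",
        "children": [{
            "label": "Gap: handling time 20% over target",
            "selected": True,
            "children": [{
                "label": "Cause: unfamiliarity with the new system",
                "selected": True,
                "children": [
                    {"label": "Solution: refresher training course",
                     "selected": True},
                    {"label": "Solution: redesign the user interface"},
                ],
            }],
        }],
    }
    print("\n".join(render(diagram)))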
3 FUTURE WORK
The concept of configurability is an important part
of the work carried out to date. The framework on
which the model is based is meant to provide a
structure for a variety of methods that can be tailored
to specific groups or situations. The same is true of
the software architecture: a fixed tool based on one
specific methodology is likely to be of limited use.
This concept is difficult to demonstrate and test
when there is only one instance conforming to the
model. A second prototype is therefore being
constructed that conforms to the framework but is
customised to the specific data collection methods,
terminology and collaboration tools used by the US
Coast Guard’s Human Performance Technology
Centre. Once more than one instance of an
organisational performance analysis tool is
available, it will be possible to investigate the
possibility of translating performance analysis data
between different tools conformant to the
framework. A domain ontology will be used to
facilitate this translation.
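One plausible shape for such ontology-mediated translation is sketched below; the shared concepts, term mappings and tool names are invented here purely for illustration and do not describe the planned implementation.

    # Hypothetical sketch: a domain ontology maps each tool's local
    # terminology onto shared concepts, enabling record translation.
    ONTOLOGY = {
        # shared concept   : term used by each conformant tool
        "performance_gap": {"tool_a": "gap", "tool_b": "deficiency"},
        "cause":           {"tool_a": "cause", "tool_b": "root_cause"},
        "solution":        {"tool_a": "solution", "tool_b": "intervention"},
    }

    def translate(record, source, target):
        # Map each field name to its shared concept, then to the target term.
        to_concept = {terms[source]: concept
                      for concept, terms in ONTOLOGY.items()}
        return {ONTOLOGY[to_concept[k]][target]: v for k, v in record.items()}

    record = {"deficiency": "response time over target",
              "root_cause": "outdated procedures"}
    print(translate(record, "tool_b", "tool_a"))
    # -> {'gap': 'response time over target', 'cause': 'outdated procedures'}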
REFERENCES
Burge, J. and Brown, D. C., 2000. Reasoning with design
rationale. In Gero, J. (ed.), Artificial Intelligence
in Design ’00, Kluwer Academic Publishers,
Dordrecht, The Netherlands, 611-629.
Cameron, J., 2002. Configurable development processes.
Communications of the ACM, 45 (3), 72-77.
Cockburn, A., 1997. Structuring use cases with goals.
Journal of Object Oriented Programming, 10 (7), 35–
40.
Douglas, I., Nowicki, C., Butler, J. and Schaffer, S., 2003.
Web-based collaborative analysis, reuse and sharing of
human performance knowledge. Proceedings of the
Inter-service/Industry Training, Simulation and
Education Conference (I/ITSEC). Orlando, Florida,
Dec.
Douglas, I. and Schaffer, S., 2002. Object-oriented
performance improvement. Performance Improvement
Quarterly, 15 (3), 81-93.
Flowers, S., 1996. Software failure: management failure.
Amazing stories and cautionary tales. Chichester,
New York: John Wiley.
Gilbert, T., 1996. Human competence: Engineering
worthy performance (Tribute Edition). Amherst, MA:
HRD Press, Inc.
Marshall, C., 2000. Enterprise Modeling with UML:
Designing Successful Software through Business
Analysis. Reading, MA: Addison-Wesley.
Moran, T.P. and Carroll, J.M., 1996. Design rationale:
concepts, techniques, and use. Mahwah, NJ: Lawrence
Erlbaum Associates.
Robinson, D. and Robinson J.C., 1995. Performance
Consulting: Moving Beyond Training. San Francisco:
Berrett-Koehler.
Rossett, A., 1999. First Things Fast: A Handbook for
Performance Analysis. San Francisco: Jossey-Bass
Pfeiffer.
Weinberg, G., 2001. An introduction to general systems
thinking. New York: Dorset House.
Yao, C., Lin, K.J., and Mathieu R.G., 2003. Web Services
Computing: Advancing Software Interoperability.
IEEE Computer, October, 36 (10), 35-37.
Ye, Y., and Fischer, G., 2002. Supporting reuse by
delivering task-relevant and personalized
information. Proceedings of the Twenty-fourth
International Conference on Software
Engineering, 513-523.