In the black box testing framework, we do not need to be familiar with the infrastructure of the performance management system because we only target the web pages of the Enterprise PM Portal. Some effort is needed to analyze user scenarios and record user actions as test scripts. The test scripts were saved in OpenSTA project files and copied between client computers. The test scripts can be replayed by the client computers to create the required traffic on the system, and reports are automatically created after each test run for analyzing the collected data. This part was finished in one day.
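The replay step amounts to re-issuing the recorded portal requests and timing the responses. The sketch below is not OpenSTA's own script format; it is a minimal Python illustration of the same idea, assuming a hypothetical list of recorded portal URLs.

import time
import urllib.request

# Hypothetical recorded user actions; in OpenSTA these live in the project's
# test scripts, here they are simply a list of portal URLs.
RECORDED_REQUESTS = [
    "http://pm-portal.example.com/login",
    "http://pm-portal.example.com/reports/summary",
    "http://pm-portal.example.com/logout",
]

def replay_once():
    """Replay the recorded scenario once and return per-request timings."""
    timings = []
    for url in RECORDED_REQUESTS:
        start = time.time()
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                resp.read()
                status = resp.status
        except Exception as exc:  # record failures instead of aborting the run
            status = f"error: {exc}"
        timings.append((url, status, time.time() - start))
    return timings

if __name__ == "__main__":
    for url, status, elapsed in replay_once():
        print(f"{url}\t{status}\t{elapsed:.3f}s")

Running several such replay processes in parallel on the client computers produces the required load, and the timing output feeds the post-run reports.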
The integrated quality assurance framework, on the other hand, required considerable effort to collect log data into the data warehouse. First, we needed to enable the logging functions of the system components. Second, we needed to analyze the logs and implement a log loader for them. The ETL process involves running the log loader, executing database SQL commands or functions, and double-checking the collected data to ensure its quality. Third, we used a metadata modeling tool to model the collected data dimensionally in the data warehouse and to publish a metadata model for reporting. Finally, the quality assurance reports and the portal needed to be designed and created manually. It took one month to set all of this up.
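To illustrate the second step, a log loader essentially parses component log lines and stages them as rows for the data warehouse, with a basic quality check on what was loaded. The following Python sketch is only an assumed example: the tab-separated log layout, the staging table stage_component_log, and the SQLite target are hypothetical stand-ins for the real components and warehouse.

import csv
import sqlite3
from datetime import datetime

# Assumed log layout (hypothetical):
# timestamp<TAB>component<TAB>user<TAB>action<TAB>duration_ms
def parse_log(path):
    with open(path, newline="") as handle:
        for row in csv.reader(handle, delimiter="\t"):
            if len(row) != 5:
                continue  # skip malformed lines; a real loader would flag them
            ts, component, user, action, duration = row
            yield (datetime.fromisoformat(ts).isoformat(),
                   component, user, action, int(duration))

def load(path, db="qa_dw.sqlite"):
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS stage_component_log (
                       event_time TEXT, component TEXT, user TEXT,
                       action TEXT, duration_ms INTEGER)""")
    con.executemany("INSERT INTO stage_component_log VALUES (?, ?, ?, ?, ?)",
                    parse_log(path))
    # Double-check the row count as a simple data quality control.
    loaded = con.execute("SELECT COUNT(*) FROM stage_component_log").fetchone()[0]
    con.commit()
    con.close()
    return loaded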
In terms of maintenance, for the black box testing framework changes to the web pages mean that the OpenSTA work of recording and replaying test scripts must be redone, but this is still quite simple and can be done in one day.
For the integrated quality assurance framework, significant maintenance effort is needed when there are changes in the system infrastructure, for example:
• Changes in the applications or services of the performance management system will bring different logs into the quality assurance framework. The log loader has to be made compatible with those logs, and the metadata model of the quality assurance data mart may need to be changed as well.
• Changes in the granularity of a component's logs require reconfiguration of the log loader and related changes between metadata objects in the quality assurance data mart.
• Changes in the goal of quality assurance may call for new reports, modifications to existing reports, or changes to the quality assurance portal.
In general, all of this work can be done in one week. It could be done faster if the integrated framework were designed to be compatible with different service and component configurations.
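One way to achieve that compatibility is to drive the log loader from per-component configuration instead of hard-coding each log format, so that infrastructure changes become configuration changes. The sketch below is a hypothetical illustration of this idea, not part of the framework described here.

# Hypothetical per-component configuration: delimiter and field order per log.
LOADER_CONFIG = {
    "portal":    {"delimiter": "\t", "fields": ["event_time", "user", "action", "duration_ms"]},
    "reporting": {"delimiter": ",",  "fields": ["event_time", "report_id", "user", "duration_ms"]},
}

def parse_line(component, line):
    """Map a raw log line to a dict according to the component's configuration."""
    cfg = LOADER_CONFIG[component]
    values = line.rstrip("\n").split(cfg["delimiter"])
    if len(values) != len(cfg["fields"]):
        return None  # granularity changed: flag for reconfiguration rather than failing
    return dict(zip(cfg["fields"], values))

With this approach, adding a service or changing a log's granularity mainly means updating LOADER_CONFIG rather than rewriting the loader.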
Implementation and maintenance effort is more significant with the integrated quality assurance framework, but once it is in place the effort associated with assessing quality is greatly reduced. With the black box testing framework, the effort needed to notice a quality issue is usually high. First, no alerts are provided to users for system exceptions, so users need to discover them manually. Second, although a simple reporting capability is usually bundled into black box testing tools, there is typically no relationship set up between reports and no portal to help users easily go through them. To find quality issues, users need to go through all of the reports and work out the relationships between the data in different reports; this is difficult even for quality assurance experts. In addition, due to the limitation on the types of data collected, there is no support for assessing quality issues such as the usage and privacy of the system.
Reports in the integrated quality assurance framework are created based on the quality assurance data mart model, a star schema, and report content can be customized.
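A quality assurance data mart organized as a star schema typically consists of one fact table keyed to several dimension tables, and the reports are built by joining the fact table to those dimensions. The schema below is a hypothetical, simplified example of such a layout, sketched in Python with SQLite; it is not the actual data mart model of the framework.

import sqlite3

# Hypothetical, simplified star schema for a quality assurance data mart:
# one fact table referencing time, user, and component dimensions.
STAR_SCHEMA = """
CREATE TABLE dim_time      (time_key INTEGER PRIMARY KEY, day TEXT, hour INTEGER);
CREATE TABLE dim_user      (user_key INTEGER PRIMARY KEY, user_name TEXT, role TEXT);
CREATE TABLE dim_component (component_key INTEGER PRIMARY KEY, component_name TEXT);
CREATE TABLE fact_qa_event (
    time_key      INTEGER REFERENCES dim_time(time_key),
    user_key      INTEGER REFERENCES dim_user(user_key),
    component_key INTEGER REFERENCES dim_component(component_key),
    duration_ms   INTEGER,
    error_count   INTEGER
);
"""

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.executescript(STAR_SCHEMA)
    # A report would then query fact_qa_event joined to the dimension tables.
    print("star schema created")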