architectural design, and i* modeling support, which is a framework suggesting an approach oriented towards RE goals and agents), with other RE-supporting tools oriented to goals, agents, and aspects. The results obtained from the evaluation are detailed in Table 7.
In total, approximately 64 hours were spent applying the instantiation. The process was performed as follows: the most recent version of each tool was downloaded from its development community website and installed. The applications were then run one by one and checked for compliance with the metrics. To evaluate the Maintainability metrics, we reviewed the current information and documentation on each tool's official website. For some metrics, the information required to qualify the software could not be found; in these cases, the minimum score (1) was awarded.
For the Stability feature in the Maintainability category, the metric was defined as the ratio of patches solved to patches found, expressed as a percentage; where no security patches were found, 100% was awarded.
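To make this convention concrete, consider a hypothetical worked example (the patch counts are invented for illustration and do not refer to any of the evaluated tools):

    Stability = (patches solved / patches found) × 100%

A tool with 3 of its 4 reported security patches solved would therefore score (3/4) × 100% = 75%, whereas a tool for which no security patches were found is awarded 100%, as stated above.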
The results obtained for the Functionality category were: StarUML 0%, OSRMT 14.29%, UCM 0%, and OpenOME 14.29%. According to the MOSCA algorithm, none of the tools met the required acceptance level of 75%. For Usability, all four tools reached or exceeded the required 75%: OpenOME, StarUML and OSRMT obtained the highest score, 87.50%, followed by UCM with 75%. Lastly, for Maintainability, all four tools exceeded 75%: OpenOME scored 90.91%, followed by StarUML, OSRMT and UCM with 81.82%.
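The following minimal sketch in Python shows how such category percentages are compared against the 75% acceptance level. The (satisfied, total) metric counts are purely illustrative, chosen only because they reproduce the percentages reported above; they are not the actual MOSCA metric sets, and the variable names are our own.

    # Minimal sketch: per-category satisfaction percentage vs. the 75%
    # acceptance level used by the MOSCA algorithm. The (satisfied, total)
    # metric counts are illustrative, not the actual MOSCA metric sets.
    ACCEPTANCE_LEVEL = 75.0  # percent

    results = {
        # tool: {category: (metrics satisfied, metrics evaluated)}
        "StarUML": {"Functionality": (0, 7), "Usability": (7, 8), "Maintainability": (9, 11)},
        "OSRMT":   {"Functionality": (1, 7), "Usability": (7, 8), "Maintainability": (9, 11)},
        "UCM":     {"Functionality": (0, 7), "Usability": (6, 8), "Maintainability": (9, 11)},
        "OpenOME": {"Functionality": (1, 7), "Usability": (7, 8), "Maintainability": (10, 11)},
    }

    for tool, categories in results.items():
        for category, (satisfied, total) in categories.items():
            percentage = 100.0 * satisfied / total
            accepted = percentage >= ACCEPTANCE_LEVEL
            print(f"{tool:8} {category:15} {percentage:6.2f}%  accepted={accepted}")

Under the MOSCA algorithm, any tool whose Functionality percentage falls below the acceptance level would, in principle, cause the evaluation to be suspended, as discussed next.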
Given that none of the four tools met the minimum satisfaction percentage required for Functionality, they qualified as null-quality tools. Nevertheless, all four tools reached at least 75% for Usability and Maintainability. According to the MOSCA algorithm, when the Functionality results do not reach 75%, the evaluation is suspended; in this case, however, since we are evaluating FLOSS tools, new functionalities can be added to them, unlike proprietary software tools, which do not allow modification. We are therefore free to choose a tool and improve it with respect to the characteristics deemed appropriate for our research. The tool selected was UCM, which is developed in C#. It should be noted that the presentation of these results goes beyond the scope of this article and will be addressed in future work.
6 CONCLUSIONS
This work proposes an instantiation of the MOSCA model to measure the quality of FLOSS-based software engineering tools supporting RE, which should be easy to use and modify. The model was applied to Open Source Requirement Management Tool (OSRMT), StarUML, Use Case Maker (UCM) and OpenOME, in order to demonstrate the model's usability and to select the most suitable tool to be modified in the near future. In this case, the tool selected was Use Case Maker. MOSCA may be adapted to any RE tool with specific characteristics and may be used by Small and Medium-sized Enterprises (SMEs) for tool evaluation purposes.
ACKNOWLEDGEMENTS
This research has been financed by FONACIT
Venezuela, Project G-2005000165. Special thanks
to A. Sevilla.
REFERENCES
Alfonso, O., Domínguez, K., Rivas, L., Pérez, M., Mendoza, L., & Ortega, M. (2008). "Quality Measurement Model for Analysis and Design Tools based on FLOSS". In Proceedings of the 19th Australian Software Engineering Conference (ASWEC 2008), Vol. 1, pp. 258-267, Perth, Australia.
Basili, V., Caldiera, G., & Rombach, H. (2001). "The Goal Question Metric Approach". In Marciniak, J. J. (Ed.), Encyclopedia of Software Engineering, Wiley, pp. 528-532.
Baskerville, R. (1999). "Investigating Information Systems with Action Research". Communications of the Association for Information Systems, vol. 2, no. 19, pp. 1-32.
Dromey, G. (1995). "A Model for Software Product Quality". IEEE Transactions on Software Engineering, vol. 21, no. 2, pp. 146-162.
ISO/IEC 9126-1 (2001). Software Engineering. Product Quality. Part 1: Quality Model. ISO.
Kitchenham, B. (1996). "Evaluating Software Engineering Methods and Tools. Part 1: The Evaluation Context and Evaluation Methods". ACM Software Engineering Notes, vol. 21, no. 1, pp. 11-14.
Mendoza, L., Pérez, M., & Grimán, A. (2005). "Prototipo de modelo sistémico de calidad (MOSCA) del software" [Prototype of a systemic software quality model (MOSCA)]. Computación y Sistemas, vol. 8, no. 3, pp. 196-217.
Pessagno, L., Domínguez, K., Rivas, L., Pérez, M., Mendoza, L., & Mendez, E. (2008). "Modelo de calidad para herramientas FLOSS que dan apoyo al modelado de procesos del negocio" [Quality model for FLOSS tools supporting business process modeling]. X Jornadas sobre