resented by the existence of milestone versions, which show a large variance in quality. These versions are characterized by the addition of many new features, which is also apparent when using the applications. In the case of jEdit 4.0pre4 and TuxGuitar 1.0rc1, this is also evident from the version numbers. However, the opposite is not true: version numbers alone are not a reliable indicator of milestone releases.
Regarding the quality models employed in our case study, we believe each of them can be useful in discovering maintainability changes. While beyond the scope of the present paper, all three models can also be applied at finer granularity, at package and class level. Each model contributed insight into the quality of the target applications, and we believe that a comprehensive analysis using several models improves the characterization of software quality.
We intend to expand our study to cover additional system types prevalent today, such as frameworks, distributed applications, and those targeting mobile devices. We also aim to study system size and complexity beyond lines of code and class counts, in order to improve our understanding of their relation to maintainability. In addition, we believe a cross-sectional approach is valuable, as it can improve our baseline by facilitating the study of a larger number of target applications.