This demonstrates that, to the best of our knowledge, our tool Dacite is the only tool that analyzes the data flow of an object-oriented program considering the fields of objects and the elements of arrays in detail, respecting aliasing, and focusing on traversable DUCs, hence providing a comprehensive data-flow analysis.
6 CONCLUSIONS
In this paper, we have presented Dacite, a tool for a comprehensive data-flow analysis of object-oriented programs. The tool dynamically analyzes the data flow of a given program and its JUnit tests. By collecting the necessary information during execution, Dacite can derive more detailed and precise information, such as the occurrence of aliases. Moreover, we have argued that dynamically analyzing the program ensures that only traversable data-flow relations are identified. It is noteworthy that all other existing tools for analyzing the data flow of object-oriented programs provide only a limited identification of the data flow, being restricted either to static analysis or to specific types of variables (see Section 5). Furthermore, these tools are either not publicly available or do not work reliably. Consequently, to the best of our knowledge, Dacite is the only available tool that provides a comprehensive and detailed data-flow analysis. It can be utilized to evaluate and assess test suites for object-oriented systems.
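To illustrate the kind of data flow that only a dynamic analysis can identify reliably, consider the following minimal Java sketch (the class and variable names are our own illustrative choices, not output of Dacite): the write through one reference and the read through an aliasing reference form a single def-use pair on the same object field, a relation that a purely static analysis can only approximate.

class Account {
    int balance;
}

public class AliasExample {
    public static void main(String[] args) {
        Account a = new Account();
        Account b = a;           // b aliases a: both refer to the same object
        a.balance = 100;         // definition of the field balance via reference a
        int x = b.balance;       // use of the same field via the alias b
        System.out.println(x);   // (def via a, use via b) is one traversed DUC
    }
}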
In the future, we plan to extend Dacite by a graphical visualization of the DUCs to further facilitate the comprehensibility of the covered data flow. Moreover, since the data-flow information is derived during execution, only those paths, and thus those data flows, that have actually been traversed are considered. When comparing JUnit tests, this is sufficient. However, when generating new JUnit tests, it would be beneficial to also have information on which data flows have not been traversed yet. In order to ensure that all data flows are covered, Dacite can be combined with our existing test-case generator (Winkelmann et al., 2022), which is based on a symbolic execution of the classes under test. Dacite can then be used to restrict the number of generated test cases to those which are required to ensure data-flow coverage.
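As a purely hypothetical sketch of this combination (the record type and method below are our invention, not part of Dacite's API), the symbolic executor would enumerate all feasible DUCs, Dacite would report the covered ones, and the set difference yields the data flows for which a test case is still required:

import java.util.HashSet;
import java.util.Set;

// Hypothetical representation of a DUC: a variable together with the
// positions of its definition and its use (not Dacite's actual data model).
record DefUseChain(String variable, int defLine, int useLine) {}

public class CoverageGap {
    // Feasible DUCs (from symbolic execution) minus covered DUCs
    // (reported by Dacite) gives the targets for test-case generation.
    static Set<DefUseChain> uncovered(Set<DefUseChain> feasible,
                                      Set<DefUseChain> covered) {
        Set<DefUseChain> gap = new HashSet<>(feasible);
        gap.removeAll(covered);
        return gap;
    }

    public static void main(String[] args) {
        Set<DefUseChain> feasible = Set.of(
            new DefUseChain("balance", 10, 12),
            new DefUseChain("balance", 10, 15));
        Set<DefUseChain> covered = Set.of(
            new DefUseChain("balance", 10, 12));
        System.out.println(uncovered(feasible, covered));
    }
}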
REFERENCES
Allen, F. E. and Cocke, J. (1976). A program data flow
analysis procedure. Communications of the ACM,
19(3):137.
Beyer, D. (2021). Software verification: 10th comparative evaluation (SV-COMP 2021). In Proc. TACAS (2), LNCS 12652. Springer.
Bluemke, I. and Rembiszewski, A. (2009). Dataflow testing of Java programs with DFC. In IFIP Central and East European Conference on Software Engineering Techniques, pages 215–228. Springer.
de Araujo, R. P. A. and Chaim, M. L. (2014). Data-flow test-
ing in the large. In 2014 IEEE Seventh International
Conference on Software Testing, Verification and Val-
idation, pages 81–90. IEEE.
Denaro, G., Gorla, A., and Pezzè, M. (2008). Contextual integration testing of classes. In International Conference on Fundamental Approaches to Software Engineering, pages 246–260. Springer.
Denaro, G., Margara, A., Pezzè, M., and Vivanti, M. (2015). Dynamic data flow testing of object oriented systems. In 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, volume 1, pages 947–958. IEEE.
Denaro, G., Pezzè, M., and Vivanti, M. (2014). On the right objectives of data flow testing. In 2014 IEEE Seventh International Conference on Software Testing, Verification and Validation, pages 71–80. IEEE.
Frankl, P. G. and Iakounenko, O. (1998). Further empiri-
cal studies of test effectiveness. SIGSOFT Softw. Eng.
Notes, 23(6):153–162.
Harrold, M. J. and Soffa, M. L. (1989). Interprocedural data flow testing. ACM SIGSOFT Software Engineering Notes, 14(8):158–167.
Majchrzak, T. A. and Kuchen, H. (2009). Automated test
case generation based on coverage analysis. In 2009
Third IEEE International Symposium on Theoretical
Aspects of Software Engineering, pages 259–266.
Misurda, J., Clause, J., Reed, J., Childers, B. R., and Soffa,
M. L. (2005). Jazz: A tool for demand-driven struc-
tural testing. In International Conference on Compiler
Construction, pages 242–245. Springer.
Pande, H. D., Landi, W. A., and Ryder, B. G. (1994). Interprocedural def-use associations for C systems with single level pointers. IEEE Transactions on Software Engineering, 20(5):385–403.
Santelices, R. and Harrold, M. J. (2007). Efficiently monitoring data-flow test coverage. In Proceedings of the Twenty-Second IEEE/ACM International Conference on Automated Software Engineering, pages 343–352.
Su, T., Wu, K., Miao, W., Pu, G., He, J., Chen, Y., and
Su, Z. (2017). A survey on data-flow testing. ACM
Computing Surveys (CSUR), 50(1):1–35.
Vincenzi, A., Wong, W., Delamaro, M., and Maldonado, J. (2003). JaBUTi: A coverage analysis tool for Java programs. In XVII SBES – Simpósio Brasileiro de Engenharia de Software, pages 79–84.
Vincenzi, A. M. R., Delamaro, M. E., Maldonado, J. C., and Wong, W. E. (2006). Establishing structural testing criteria for Java bytecode. Software: Practice and Experience, 36(14):1513–1541.
Winkelmann, H., Troost, L., and Kuchen, H. (2022). Constraint-logic object-oriented programming for test case generation. In Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing, pages 1490–1499.