as in (Chilenski and Newcomb 1994). Although
our aim in this paper is to generate MC/DC-
compliant test suites, keeping this broader idea in
mind should make it easier to introduce other
criteria later if the need arises.
- The components of the tool must comply with
the ‘separation of concerns’ rule. Separation of
concerns (SoC) is the process of breaking a
program into distinct features that overlap in
functionality as little as possible (Jackson 2006).
In this case, the components (program analyzer,
strategy handler and generator) must be
implemented in such a way that amendments to
one component do not greatly affect the others.
For example, if the instrumentation of the code
in the program analyzer depends on the criterion
formulated in the strategy handler, changing the
chosen criterion may require extensive
modifications throughout the tool.
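As an illustration of this decoupling (a hypothetical sketch, not taken from the tool described in the paper), the coverage criterion can be placed behind an interface so that neither the program analyzer nor the generator depends on a specific criterion such as MC/DC; the class and method names below are assumptions for the example only.

```python
from abc import ABC, abstractmethod

class CoverageCriterion(ABC):
    """Pluggable strategy: the criterion is isolated behind this interface."""
    @abstractmethod
    def obligations(self, conditions):
        """Return the test obligations for a decision over `conditions`."""

class DecisionCoverage(CoverageCriterion):
    # Simplest possible criterion: the decision must evaluate to both
    # True and False at least once.
    def obligations(self, conditions):
        return ["decision is True", "decision is False"]

class Generator:
    # The generator is parameterized by the criterion; swapping in MC/DC
    # (or another criterion) requires no change here or in the analyzer.
    def __init__(self, criterion: CoverageCriterion):
        self.criterion = criterion

    def targets_for(self, conditions):
        return self.criterion.obligations(conditions)

gen = Generator(DecisionCoverage())
print(gen.targets_for(["A", "B"]))
```

With this shape, replacing `DecisionCoverage` by an MC/DC strategy changes only the strategy class, which is exactly the property the SoC rule asks for.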
Figure 6: A different view of test case generation.
Figure 6 gives a different view of the test case
generation process. The main idea emphasized in
this figure is that MC/DC is separated from the
other components of the generator, so its
formalization can be handled separately, for
instance using Z notation, provided that the
generator is able to interpret this notation. Another
message of Figure 6 is that the code, once
transformed into a format the generator can
understand, must remain consistent with the formal
specification of the program.
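To make the separated criterion concrete, the MC/DC obligation itself can be stated independently of any particular generator. The following sketch (illustrative only, not the paper's implementation) searches, for each condition of a decision, for an independence pair: two test vectors that differ only in that condition and produce different decision outcomes, which is the core requirement of unique-cause MC/DC.

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition index, find an independence pair: two vectors
    differing only in that condition whose decision outcomes differ."""
    pairs = {}
    vectors = list(product([False, True], repeat=n_conditions))
    for i in range(n_conditions):
        for v in vectors:
            w = list(v)
            w[i] = not w[i]          # toggle only condition i
            w = tuple(w)
            if decision(*v) != decision(*w):
                pairs[i] = (v, w)    # condition i independently affects the outcome
                break
    return pairs

# Example decision: A and (B or C)
pairs = mcdc_pairs(lambda a, b, c: a and (b or c), 3)
print(pairs)  # one independence pair per condition
```

The vectors collected in `pairs` form an MC/DC-compliant test set for the example decision; a generator interpreting a formalized criterion would produce obligations of exactly this kind.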
There is certainly more work to be done on this
subject. However, the strengths of the tools and
techniques outlined in this paper provide a guideline
for building future tools that cover MC/DC, and we
continue to explore the use of formal methods to
achieve MC/DC-compliant test case generation.
REFERENCES
AC#20-115B (2003). Advisory Circular (AC) # 20-115B,
FAA.
Adrion, R., Branstad, M. and Cherniavsky, J., (1982).
Validation, Verification and Testing of Computer
Software. Computing Surveys, ACM.
Ammann, P., Offutt, J. and Huang, H., (2003). Coverage
Criteria for Logical Expressions. International
Symposium on Software Reliability Engineering
(ISSRE '03).
Cavarra, A., Crichton, C., Davies, J., Hartman, A., Jeron,
T. and Monier, L., (2002). Using UML for Automatic
Test Generation. Proceedings of ISSTA.
Certification Authorities Software Team (CAST), 2001.
Rationale for Accepting Masking MC/DC in
Certification Projects.
Chilenski, J.J. and Miller, S.P., (1994). Applicability of
Modified Condition/Decision Coverage to Software
Testing. Software Engineering Journal.
Chilenski, J.J. and Newcomb, P.H., (1994). Formal
Specification Tools for Test Coverage Analysis. The
Boeing Company.
Díaz, E., Tuya, J. and Blanco, R., (2004). A Modular Tool
for Automated Coverage in Software Testing,
Software Technology and Engineering Practice. IEEE
CS Press, pp. 234-240.
DO-178B, (1992). DO-178B: Software Considerations in
Airborne Systems and Equipment Certification,
RTCA, Washington D.C., USA.
Durrieu, G., Laurent, O., Seguin, C. and Wiels, V., (2004),
Automatic Test case Generation for Critical Embedded
Systems. DASIA’04.
Edvardsson, J., (1999), A Survey on Test Data Generation,
ECSEL.
Ferguson, R. and Korel, B., (1996). The Chaining Approach
for Software Test Data Generation. ACM Transactions
on Software Engineering and Methodology, 5(1):63-
86.
Hayhurst, K., Veerhusen, D., Chilenski, J. and Rierson,
L.K., (2001). A Practical Tutorial on Modified
Condition/Decision Coverage. NASA.
Jackson, M., (2006). What can we expect from program
verification? Innovative Technology for Computer
Professionals, 39(10), pp. 65-71.
Kapoor, K. and Bowen, J., (2004). Formal Analysis of
MCDC and RCDC Test Criteria. London South Bank
University.
Korel, B. and Al-Yami, A. M., (1996). Assertion-oriented
automated test data generation. Proceedings of the
18th International Conference on Software
Engineering, (ICSE), pages 71-80. IEEE.
Prasanna, M., Sivanandam, S.N., Venkatesan, R. and
Sundarrajan, R., (2005). A Survey on Automatic Test
Case Generation. Academic Open Internet Journal,
Volume 15.
Tracey, N., Clark, J., Mader, K. and McDermid, J., (1998).
An Automated Framework for Structural Test Data
Generation. 13th IEEE International Conference on
Automated Software Engineering.
GOAL-ORIENTED AUTOMATIC TEST CASE GENERATORS FOR MC/DC COMPLIANCY