Cost-effective Functional Testing of Reactive Software

R. Venkatesh, Ulka Shrotri, Amey Zare, Supriya Agrawal

2015

Abstract

Creating test cases to cover all functional requirements of real-world systems is hard, even for domain experts. Any method to generate functional test cases must have three attributes: (a) an easy-to-use formal notation for specifying requirements, from a practitioner’s point of view, (b) a scalable test-generation algorithm, and (c) coverage criteria that map to requirements. In this paper we present a method that has all three attributes. First, it includes Expressive Decision Table (EDT), a requirement specification notation designed to reduce translation effort. Second, it implements a novel, scalable row-guided random algorithm with fuzzing (RGRaF, pronounced R-graph) to generate test cases. Finally, it implements two new coverage criteria targeted at requirements and requirement interactions. To evaluate the method, we conducted experiments on three real-world applications. In these experiments, RGRaF achieved better coverage than pure random test-case generation. Compared with the manual approach, our test cases subsumed all manually written test cases and achieved up to 60% effort savings. More importantly, when run on code, our test cases uncovered a bug in a post-production sub-system and revealed three missing requirements in another.
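The abstract names RGRaF, a row-guided random test-generation algorithm with fuzzing, but does not detail it. Below is a minimal illustrative sketch, in Python, of how row-guided random generation with input fuzzing could be structured. The row representation (ROWS), the fuzz step, and all identifiers here are assumptions made for illustration only; they are not the paper's actual RGRaF implementation.

import random

# Each "row" pairs a guard over the inputs with an expected output.
# This mirrors a decision-table row only loosely; the structure is assumed.
ROWS = [
    {"id": "R1", "guard": lambda i: i["speed"] > 120 and i["brake"],
     "expect": {"alarm": True}},
    {"id": "R2", "guard": lambda i: i["speed"] <= 120,
     "expect": {"alarm": False}},
]

def random_input():
    # Pure random step: draw an arbitrary input vector.
    return {"speed": random.randint(0, 200), "brake": random.choice([True, False])}

def fuzz(inputs):
    # Perturb one input slightly, hoping to tip a not-yet-satisfied guard.
    mutated = dict(inputs)
    mutated["speed"] = max(0, min(200, mutated["speed"] + random.randint(-10, 10)))
    return mutated

def generate(budget=1000):
    tests, covered = [], set()
    for _ in range(budget):
        candidate = random_input()
        # Row guidance: repeatedly fuzz the candidate toward an uncovered row.
        for _ in range(5):
            hit = [r for r in ROWS if r["id"] not in covered and r["guard"](candidate)]
            if hit:
                covered.update(r["id"] for r in hit)
                tests.append((candidate, [r["expect"] for r in hit]))
                break
            candidate = fuzz(candidate)
        if covered == {r["id"] for r in ROWS}:
            break  # all rows covered; stop early
    return tests, covered

if __name__ == "__main__":
    tests, covered = generate()
    print(f"covered rows: {sorted(covered)}; generated {len(tests)} test cases")

Under these assumptions, coverage is measured per requirement row, which loosely corresponds to the requirement-level coverage criteria the abstract mentions; the paper's own criteria and algorithm may differ substantially.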



Paper Citation


in Harvard Style

Venkatesh R., Shrotri U., Zare A. and Agrawal S. (2015). Cost-effective Functional Testing of Reactive Software. In Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE, ISBN 978-989-758-100-7, pages 67-77. DOI: 10.5220/0005347800670077


in Bibtex Style

@conference{enase15,
author={R. Venkatesh and Ulka Shrotri and Amey Zare and Supriya Agrawal},
title={Cost-effective Functional Testing of Reactive Software},
booktitle={Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE},
year={2015},
pages={67-77},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005347800670077},
isbn={978-989-758-100-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE
TI - Cost-effective Functional Testing of Reactive Software
SN - 978-989-758-100-7
AU - Venkatesh R.
AU - Shrotri U.
AU - Zare A.
AU - Agrawal S.
PY - 2015
SP - 67
EP - 77
DO - 10.5220/0005347800670077