A Container-centric Methodology for Benchmarking Workflow Management Systems

Vincenzo Ferme, Ana Ivanchikj, Cesare Pautasso, Marigianna Skouradaki, Frank Leymann


Trusted benchmarks should provide reproducible results obtained by following a transparent and well-defined process. In this paper, we show how Containers, originally developed to ease the automated deployment of Cloud application components, can be used in the context of a benchmarking methodology. The proposed methodology targets Workflow Management Systems (WfMSs), a critical class of service orchestration middleware whose architectural complexity makes Docker Containers a highly suitable deployment approach. The contributions of our work are: 1) a new benchmarking approach that takes full advantage of containerization technologies; and 2) the formalization of the interaction process with WfMS vendors in a clearly written agreement. We thus take advantage of emerging Cloud technologies to address technical challenges and ensure that the performance measurements can be trusted. We also make the benchmarking process transparent, automated, and repeatable, so that WfMS vendors can join the benchmarking effort.
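As a rough illustration of what a container-centric benchmark deployment might look like, the sketch below uses Docker Compose to start a WfMS under test next to a load driver. This is only an assumed setup for illustration: the image names (`my-wfms`, `my-load-driver`), the environment variables, and the resource limits are hypothetical placeholders, not part of the paper's actual framework.

```yaml
# Hypothetical sketch of a container-centric benchmark deployment.
# All image names and variables are illustrative placeholders.
version: "2.4"
services:
  wfms:                          # the Workflow Management System under test
    image: my-wfms:latest        # placeholder; each vendor ships its own image
    ports:
      - "8080:8080"
    mem_limit: 4g                # pin resources so runs are comparable
    cpus: 2
  load-driver:                   # issues workflow instances against the WfMS
    image: my-load-driver:latest
    environment:
      TARGET_URL: http://wfms:8080
      LOAD_USERS: "50"           # illustrative workload parameter
    depends_on:
      - wfms
```

Fixing resource limits and wiring in the deployment descriptor is one way containerization supports the reproducibility goal: the same configuration can be redeployed unchanged on any Docker host.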



Paper Citation

in Harvard Style

Ferme V., Ivanchikj A., Pautasso C., Skouradaki M. and Leymann F. (2016). A Container-centric Methodology for Benchmarking Workflow Management Systems. In Proceedings of the 6th International Conference on Cloud Computing and Services Science - Volume 2: CLOSER, ISBN 978-989-758-182-3, pages 74-84. DOI: 10.5220/0005908400740084

in Bibtex Style

@conference{ferme2016container,
author={Vincenzo Ferme and Ana Ivanchikj and Cesare Pautasso and Marigianna Skouradaki and Frank Leymann},
title={A Container-centric Methodology for Benchmarking Workflow Management Systems},
booktitle={Proceedings of the 6th International Conference on Cloud Computing and Services Science - Volume 2: CLOSER},
year={2016},
pages={74-84},
doi={10.5220/0005908400740084},
isbn={978-989-758-182-3},
}

in EndNote Style

JO - Proceedings of the 6th International Conference on Cloud Computing and Services Science - Volume 2: CLOSER
TI - A Container-centric Methodology for Benchmarking Workflow Management Systems
SN - 978-989-758-182-3
AU - Ferme V.
AU - Ivanchikj A.
AU - Pautasso C.
AU - Skouradaki M.
AU - Leymann F.
PY - 2016
SP - 74
EP - 84
DO - 10.5220/0005908400740084