MEFORMA Security Evaluation Methodology - A Case Study

Ernő Jeges, Balázs Berkes, Balázs Kiss, Gergely Eberhardt

2014

Abstract

Even software engineers tend to forget that the security incidents we experience today stem from defects in the code – actual bugs – committed by them. Constrained by resources, many software vendors ignore security entirely until they face an incident, or tackle security only through the options they believe to be cheapest – which usually means post-incident patching and automatic updates. Security, however, should be applied holistically and interwoven into the entire product development lifecycle. Eliminating security problems is challenging: while engineers have to be vigilant and find every single bug in the code to make a product secure, an attacker only has to find a single remaining vulnerability to exploit it and take control of the entire system. This is why security evaluation is so different from functional testing, and why it needs to be performed by a well-prepared security expert. In this paper we tackle the challenge of security testing and introduce our methodology for evaluating the security of IT products – MEFORMA was specifically created as a framework for commercial security evaluations, and has already been proven in more than 50 projects over twelve years.



Paper Citation


in Harvard Style

Jeges E., Berkes B., Eberhardt G. and Kiss B. (2014). MEFORMA Security Evaluation Methodology - A Case Study. In Proceedings of the 4th International Conference on Pervasive and Embedded Computing and Communication Systems - Volume 1: MeSeCCS, (PECCS 2014), ISBN 978-989-758-000-0, pages 267-274. DOI: 10.5220/0004919902670274


in Bibtex Style

@conference{meseccs14,
author={Ernő Jeges and Balázs Berkes and Gergely Eberhardt and Balázs Kiss},
title={MEFORMA Security Evaluation Methodology - A Case Study},
booktitle={Proceedings of the 4th International Conference on Pervasive and Embedded Computing and Communication Systems - Volume 1: MeSeCCS, (PECCS 2014)},
year={2014},
pages={267-274},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004919902670274},
isbn={978-989-758-000-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 4th International Conference on Pervasive and Embedded Computing and Communication Systems - Volume 1: MeSeCCS, (PECCS 2014)
TI - MEFORMA Security Evaluation Methodology - A Case Study
SN - 978-989-758-000-0
AU - Jeges E.
AU - Berkes B.
AU - Eberhardt G.
AU - Kiss B.
PY - 2014
SP - 267
EP - 274
DO - 10.5220/0004919902670274