measures, including firewalls, port and protocol
restrictions, and some communication scanning, may
be required. This process should, however, reduce
the attack space significantly. The implementation
(scheduled for summer 2014) will have to be
monitored and evaluated as it proceeds. This
includes tracking exploits, the response of the
software remediation system, the degree to which
each vulnerability was knowable before its exploit,
and newly discovered vulnerabilities. It is expected
that the analyses presented in this paper will need
refinement based upon that feedback.
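The evaluation criteria above (exploit tracking, remediation response time, and advance knowability) could be captured in a simple record structure. The sketch below is illustrative only; the field names, dates, and the "knowable" rule (publicly disclosed before the exploit was observed) are assumptions for this sketch, not part of the implementation described here.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ExploitRecord:
    """One observed exploit, matched (where possible) to a public entry.

    All field names here are hypothetical, chosen for illustration.
    """
    cve_id: Optional[str]       # e.g. "CVE-2014-0001"; None if no public entry
    disclosed: Optional[date]   # date the vulnerability was publicly disclosed
    exploited: date             # date the exploit was first observed
    remediated: Optional[date]  # date remediation closed it, None if still open

def was_knowable(rec: ExploitRecord) -> bool:
    """Count a vulnerability as 'knowable' if it was publicly disclosed
    before the exploit was observed; otherwise treat it as a zero-day."""
    return rec.disclosed is not None and rec.disclosed < rec.exploited

def days_to_remediate(rec: ExploitRecord) -> Optional[int]:
    """Response time of the remediation system in days, or None if open."""
    if rec.remediated is None:
        return None
    return (rec.remediated - rec.exploited).days

# Hypothetical monitoring data, for illustration only.
records = [
    ExploitRecord("CVE-2014-0001", date(2014, 1, 31),
                  date(2014, 6, 10), date(2014, 6, 12)),
    ExploitRecord(None, None, date(2014, 7, 2), None),  # zero-day, still open
]
knowable = sum(was_knowable(r) for r in records)
print(f"{knowable} of {len(records)} exploited vulnerabilities "
      f"were knowable in advance")
```

Aggregating such records over the monitoring period would give the feedback signals the paragraph above calls for.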
Portions of this architecture are described in
(Simpson, 2011, 2012a, 2012b).
REFERENCES
Common Criteria for Information Technology Security
Evaluation, 2009 (all version 3.1, revision 3):
a) Part 1: Introduction and general model.
b) Part 2: Security functional components.
c) Part 3: Security assurance components.
d) Common Methodology for Information
Technology Security Evaluation.
CMMI Institute, 2013, Standard CMMI Appraisal Method
for Process Improvement (SCAMPI) Version 1.3a:
Method Definition Document for SCAMPI A, B, and
C, http://cmmiinstitute.com/resource/standard-cmmi-
appraisal-method-process-improvement-scampi-b-c-
version-1-3a-method-definition-document/
Department of Defense, 2012a, Committee on National
Security Systems Instruction (CNSSI) No. 1253,
“Security Categorization and Control Selection for
National Security Systems,” categories for Moderate or
High Risk Impact as delineated in NIST SP 800-53.
Department of Defense, 2012b, DoD Directive (DoDD)
O-8530.1, Computer Network Defense (CND).
Finifter, Matthew, et al., “An Empirical Study of
Vulnerability Rewards Programs”, USENIX Security
2013, August 15, 2013.
HP Security Tools, 2013, http://h20331.www2.hp.com/
hpsub/cache/281822-0-0-225-121.html?jumpid=ex_
2845_vanitysecur/productssecurity/ka011106
Huang, Y.-W., et al., 2004, “Securing web application
code by static analysis and runtime protection,” in
WWW ’04: Proceedings of the 13th international
conference on World Wide Web. New York, NY,
USA: ACM, pp. 40–52.
IBM Rational, 2013, http://www-03.ibm.com/software/
products/us/en/appscan
Intel Compilers, 2013, http://software.intel.com/en-
us/intel-compilers/
Jones, Paul, 2010, “A Formal Methods-based verification
approach to medical device software analysis,”
Embedded Systems Design, http://www.embedded.
com/design/prototyping-and-development/4008888/
A-Formal-Methods-based-verification-approach-to-
medical-device-software-analysis
Jovanovic, N., et al., 2006, “Pixy: A static analysis tool
for detecting web application vulnerabilities (short
paper),” in 2006 IEEE Symposium on Security and
Privacy, pp. 258–263. [Online]. Available: http://
www.iseclab.org/papers/pixy.pdf
Kals, S., et al., 2006, “SecuBat: a web vulnerability
scanner,” in WWW ’06: Proc. 15th Int’l Conf. World
Wide Web, pp. 247–256.
Kiezun, A., et al., 2009, “Automatic creation of SQL
injection and cross-site scripting attacks,” in ICSE ’09:
Proceedings of the 31st International Conference on
Software Engineering, Vancouver, BC, Canada, May
20–22.
Livshits, Benjamin, 2006, Improving Software Security
with Precise Static and Runtime Analysis, Section 7.3,
“Static Techniques for Security,” Ph.D. thesis, Stanford
University.
Livshits, B., et al., 2008, “Securing web applications with
static and dynamic information flow tracking,” in
PEPM ’08: Proceedings of the 2008 ACM SIGPLAN
symposium on Partial evaluation and semantics-based
program manipulation. New York, NY, USA: ACM,
pp. 3–12.
Maggi, F., 2009, “Protecting a moving target: Addressing
web application concept drift,” in RAID, pp. 21–40.
McAllister, S., et al., 2008, “Leveraging user interactions
for in-depth testing of web applications,” in RAID ’08:
Proc. 11th Int’l Symp. Recent Advances in Intrusion
Detection, pp. 191–210.
Mitre, 2013a, Common Vulnerabilities and Exposures,
http://cve.mitre.org/
Mitre, 2013b, Common Weakness Enumeration,
http://cwe.mitre.org/
Mosaic, 2013, http://mosaicsecurity.com/categories/27-
network-penetration-testing
NIST, 2006, National Voluntary Laboratory Accreditation
Program, http://www.nist.gov/nvlap/upload/nist-
handbook-150.pdf
NIST, 2009, National Institute of Standards and
Technology, Gaithersburg, MD: NIST Special
Publication 800-53, Recommended Security Controls
for Federal Information Systems and Organizations,
Revision 3, August 2009.
NIST, 2013, National Vulnerability Database,
http://nvd.nist.gov/
Simpson, William R., et al., 2011, Lecture Notes in
Engineering and Computer Science, Proceedings
World Congress on Engineering and Computer
Science, Volume I, “High Assurance Challenges for
Cloud Computing”, pp. 61-66, Berkeley, CA.
Simpson, William R., and Chandersekaran, C., 2012a,
Lecture Notes in Engineering and Computer Science,
Proceedings World Congress on Engineering, The
2012 International Conference of Information Security
and Internet Engineering, Volume I, “Claims-Based
Enterprise-Wide Access Control”, pp. 524-529,
London.
Simpson, William R., and Chandersekaran, C., 2012b,
International Journal of Scientific Computing, Vol. 6,
No. 2, “A Uniform Claims-Based Access Control for
the Enterprise”, ISSN: 0973-578X, pp. 1-23.
Vulnerability and Remediation for a High-assurance Web-based Enterprise