DOI: 10.1145/3407023.3409197

Research article

A framework for automated evaluation of security metrics

Published: 25 August 2020

ABSTRACT

Observation is the foundation of scientific experimentation. We consider observations to be measurements when they are quantified with respect to an agreed-upon scale, or measurement unit. A number of metrics have been proposed in the literature that attempt to quantify some property of cyber security, but no systematic validation has been conducted to characterize the behaviour of these metrics as measurement instruments, or to understand how the quantity being measured relates to the security of the system under test. In this paper we broadly classify the body of available security metrics against the recently released Cyber Security Body of Knowledge, and identify common attributes across metric classes which may serve as useful anchors for comparison. We propose a general four-stage evaluation pipeline to encapsulate the processing specifics of each metric, encouraging a separation of the actual measurement logic from the model it is often paired with in publication. Decoupling these stages allows us to systematically apply a range of inputs to a set of metrics, and we demonstrate some important results in our proof of concept. First, we determine a metric's suitability for use as a measurement instrument against validation criteria such as operational range, sensitivity, and precision by observing performance over controlled variations of a reference input. Then we show how evaluating multiple metrics against common reference sets allows direct comparison of results and identification of patterns in measurement performance. Consequently, development and operations teams can also use this strategy to evaluate security tradeoffs between competing input designs, or to measure the effects of incremental changes during production deployments.
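The abstract does not name the four pipeline stages or any concrete metric, so the decomposition below is only a minimal sketch of the idea it describes: separate input generation, measurement logic, model interpretation, and validation analysis into independent stages, then drive the pipeline with controlled variations of a reference input to characterize the metric as a measurement instrument. All function names, the placeholder metric, and the validation formulas are assumptions for illustration, not the authors' implementation.

```python
from statistics import stdev

def generate_inputs(base_size, variations):
    """Stage 1 (assumed): produce controlled variations of a reference input."""
    return [base_size + delta for delta in variations]

def measure(host_count):
    """Stage 2 (assumed): the raw measurement logic, decoupled from any model.
    A trivial placeholder metric stands in for a real security metric."""
    return host_count * 0.5

def apply_model(raw):
    """Stage 3 (assumed): a model interpreting the raw measurement,
    here a simple normalisation onto [0, 1]."""
    return min(raw / 100.0, 1.0)

def validate(scores):
    """Stage 4 (assumed): characterise the metric as a measurement
    instrument using illustrative validation criteria."""
    return {
        # span of outputs observed over the controlled input range
        "operational_range": (min(scores), max(scores)),
        # average output change per input variation (crude proxy)
        "sensitivity": (max(scores) - min(scores)) / len(scores),
        # spread of repeated observations (sample standard deviation)
        "precision": stdev(scores) if len(scores) > 1 else 0.0,
    }

# Drive the decoupled stages over controlled variations of one reference input.
inputs = generate_inputs(base_size=50, variations=[-10, -5, 0, 5, 10])
scores = [apply_model(measure(x)) for x in inputs]
report = validate(scores)
```

Because each stage is an independent function, the same validation stage can be reused unchanged to compare several metrics against a common reference set, which is the comparison strategy the abstract argues for.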


Published in

ARES '20: Proceedings of the 15th International Conference on Availability, Reliability and Security
August 2020, 1073 pages
ISBN: 9781450388337
DOI: 10.1145/3407023
Program Chairs: Melanie Volkamer, Christian Wressnegger

    Copyright © 2020 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 228 of 451 submissions, 51%
