DOI: 10.1145/2381913.2381932
research-article

Benchmarking cloud security level agreements using quantitative policy trees

Published: 19 October 2012

ABSTRACT

While the many economic and technological advantages of Cloud computing are apparent, the migration of key sector applications onto it has been limited, in part, by the lack of security assurance from the Cloud Service Provider (CSP). However, the recent efforts on the specification of security statements in Service Level Agreements, also known as "Security Level Agreements" or SecLAs, are a positive development. While a consistent notion of Cloud SecLAs is still developing, some major CSPs are already creating and storing their advocated SecLAs in publicly available repositories, e.g., the Cloud Security Alliance's "Security, Trust & Assurance Registry" (CSA STAR). While several academic and industrial efforts are developing methods to build and specify Cloud SecLAs, very few works deal with techniques to quantitatively reason about SecLAs in order to provide security assurance. This paper proposes a method to benchmark -- both quantitatively and qualitatively -- the Cloud SecLAs of one or more CSPs with respect to a user-defined requirement, also expressed as a SecLA. The contributed security benchmark methodology rests on the notion of Quantitative Policy Trees (QPT), a data structure that we propose to represent and systematically reason about SecLAs. In this paper we perform the initial validation of the contributed methodology with respect to another state-of-the-art proposal, which in turn was empirically validated using the SecLAs stored in the CSA STAR repository. Finally, our research also contributes QUANTS-as-a-Service (QUANTSaaS), a system that implements the proposed Cloud SecLA benchmark methodology.
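To make the idea concrete, the following is a minimal sketch of what a tree-shaped SecLA benchmark could look like. It is not the paper's actual QPT construction or aggregation operators: the class and function names (`QPTNode`, `benchmark`), the [0, 4] leaf rating scale, and the weighted-mean aggregation are all illustrative assumptions. Leaves hold quantified security statements, internal nodes aggregate their children, and a provider tree is compared against a user-requirement tree of the same shape.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QPTNode:
    """Hypothetical node of a quantitative policy tree.

    Leaves carry a quantified security level (here assumed in [0, 4],
    e.g., a rating derived from a SecLA statement); internal nodes
    aggregate their children with a weighted mean (an assumption --
    other aggregation operators are equally plausible)."""
    name: str
    level: float = 0.0                  # used only at leaves
    weight: float = 1.0                 # relative importance among siblings
    children: List["QPTNode"] = field(default_factory=list)

    def score(self) -> float:
        """Aggregate this subtree into a single quantitative level."""
        if not self.children:
            return self.level
        total_weight = sum(c.weight for c in self.children)
        return sum(c.weight * c.score() for c in self.children) / total_weight

def benchmark(provider: QPTNode, requirement: QPTNode) -> float:
    """Signed gap between a CSP's aggregated level and the user-defined
    requirement; a non-negative result means the provider meets it."""
    return provider.score() - requirement.score()

# Illustrative usage: a provider that over-delivers on encryption.
provider = QPTNode("root", children=[
    QPTNode("encryption", level=3.0, weight=2.0),
    QPTNode("backup", level=2.0, weight=1.0),
])
requirement = QPTNode("root", children=[
    QPTNode("encryption", level=2.0, weight=2.0),
    QPTNode("backup", level=2.0, weight=1.0),
])
gap = benchmark(provider, requirement)   # (2*3 + 1*2)/3 - 2 = 2/3
```

Ranking several CSPs against the same requirement tree then reduces to sorting them by this gap, which mirrors the quantitative side of the benchmark the abstract describes.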

References

  1. Almorsy, M., et al. Collaboration-Based Cloud Computing Security Management Framework. In Proc. of IEEE Intl. Conference on Cloud Computing, pages 364--371, 2011.
  2. Andrieux, K., et al. Web Services Agreement Specification (WS-Agreement). Technical Report TR-WSAgreement-2007, Open Grid Forum, 2007.
  3. Bernsmed, K., et al. Security SLAs for Federated Cloud Services. In Proc. of IEEE Availability, Reliability and Security, pages 202--209, 2011.
  4. Binnig, C., et al. How is the weather tomorrow?: towards a benchmark for the cloud. In Proc. of the ACM Workshop on Testing Database Systems, pages 9:1--9:6, 2009.
  5. Bistarelli, S., et al. Defense trees for economic evaluation of security investments. In Proc. of Availability, Reliability and Security, pages 8--16, 2006.
  6. Casola, V., et al. Interoperable Grid PKIs Among Untrusted Domains: An Architectural Proposal. In Advances in Grid and Pervasive Computing, volume 4459 of Springer LNCS, pages 39--51, 2007.
  7. Casola, V., et al. A Reference Model for Security Level Evaluation: Policy and Fuzzy Techniques. Journal of Universal Computer Science, pages 150--174, 2005.
  8. Casola, V., et al. A SLA evaluation methodology in Service Oriented Architectures. In Quality of Protection, volume 23 of Springer Advances in Information Security, pages 119--130, 2006.
  9. Cloud Security Alliance. The Consensus Assessments Initiative Questionnaire. Online: https://cloudsecurityalliance.org/research/cai/, 2011.
  10. Cloud Security Alliance. The Security, Trust & Assurance Registry (STAR). Online: https://cloudsecurityalliance.org/star/, 2011.
  11. Cloud Security Alliance. Security and Privacy Level Agreements working groups. Online: https://cloudsecurityalliance.org/research/pla/, 2012.
  12. Dekker, M. and Hogben, G. Survey and analysis of security parameters in cloud SLAs across the European public sector. Technical Report TR-2011-12-19, European Network and Information Security Agency, 2011.
  13. Dumitras, T. and Shou, D. Toward a standard benchmark for computer security research: the worldwide intelligence network environment (WINE). In Proc. of the ACM BADGERS Workshop, pages 89--96, 2011.
  14. Forum of Incident Response and Security Teams. CVSS -- Common Vulnerability Scoring System. Online: http://www.first.org/cvss/, 2012.
  15. Henning, R. Security service level agreements: quantifiable security for the enterprise? In Proc. of ACM Workshop on New Security Paradigms, pages 54--60, 1999.
  16. Hoff, C., et al. CloudAudit 1.0 -- Automated Audit, Assertion, Assessment, and Assurance API (A6). Technical Report draft-hoff-cloudaudit-00, IETF, 2011.
  17. Irvine, C. and Levin, T. Quality of security service. In Proc. of ACM Workshop on New Security Paradigms, pages 91--99, 2001.
  18. Jansen, W. Directions in security metrics research. Technical Report TR-7564, National Institute of Standards and Technology, 2010.
  19. Livshits, B. Stanford SecuriBench. Online: http://suif.stanford.edu/livshits/securibench/, 2005.
  20. Livshits, B. and Lam, M. Finding security errors in Java programs with static analysis. In Proc. of USENIX Security Conference, pages 18--18, 2005.
  21. Ludwig, H., et al. Web Service Level Agreement (WSLA) Language Specification. Technical Report TR-WSLA-2003-01-28, IBM, 2003.
  22. Luna, J., et al. Providing security to the Desktop Data Grid. In Proc. of IEEE Symposium on Parallel and Distributed Processing, pages 1--8, 2008.
  23. Luna, J., et al. A Security Metrics Framework for the Cloud. In Proc. of Security and Cryptography, pages 245--250, 2011.
  24. Luna, J., et al. Quantitative Assessment of Cloud Security Level Agreements: A Case Study. In Proc. of Security and Cryptography, (in press).
  25. mOSAIC. mOSAIC FP7. Online: http://www.mosaic-cloud.eu/, 2011.
  26. Neto, A., et al. To benchmark or not to benchmark security: That is the question. In Proc. of DSN HotDep Workshop, pages 182--187, 2011.
  27. Parrend, P. and Frenot, S. Security benchmarks of OSGi platforms: toward Hardened OSGi. Softw. Pract. Exper., 39:471--479, 2009.
  28. Poe, J. and Li, T. BASS: a benchmark suite for evaluating architectural security systems. SIGARCH Comput. Archit. News, 34(4):26--33, 2006.
  29. Samani, R., et al. Common Assurance Maturity Model: Scoring Model. Online: http://common-assurance.com/, 2011.
  30. Savola, R., et al. Towards Wider Cloud Service Applicability by Security, Privacy and Trust Measurements. In Proc. of IEEE Application of Information and Communication Technologies, pages 1--6, 2010.
  31. Schneier, B. Assurance. Online: http://www.schneier.com/blog/archives/2007/08/assurance.html, 2007.
  32. Weisstein, E. Frobenius Norm. Online: http://mathworld.wolfram.com/FrobeniusNorm.html, 2011.
  33. Weisstein, E. L1-Norm. Online: http://mathworld.wolfram.com/L1-Norm.html, 2011.

Published in

CCSW '12: Proceedings of the 2012 ACM Workshop on Cloud computing security workshop
October 2012
134 pages
ISBN: 9781450316651
DOI: 10.1145/2381913

      Copyright © 2012 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Qualifiers

      • research-article

      Acceptance Rates

Overall Acceptance Rate: 37 of 108 submissions, 34%

