DOI: 10.1145/3106237.3121276

FOSS version differentiation as a benchmark for static analysis security testing tools

Published: 21 August 2017

Abstract

We propose a novel methodology for automatically constructing benchmarks for Static Analysis Security Testing (SAST) tools from real-world software projects by differencing vulnerable and fixed versions in FOSS repositories. The methodology allows us to evaluate the "actual" performance of SAST tools, i.e., without unrelated alarms. To test our approach, we benchmarked 7 SAST tools (although we report results only for the two best tools) against 70 revisions of four major versions of Apache Tomcat, using 62 distinct CVEs as the source of ground-truth vulnerabilities.
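
The core of the approach described above is to treat the difference between a vulnerable revision and its fixed revision as ground truth: a SAST warning counts as relevant only if it points into the code changed by the security fix for a known CVE, and everything else is treated as an unrelated alarm. The sketch below is a minimal illustration of that differencing idea, not the authors' tooling: the repository path, the revision identifiers, and the `Finding` record for tool output are hypothetical.

```python
import subprocess
from dataclasses import dataclass


@dataclass(frozen=True)
class Finding:
    """One SAST warning, reduced to the file it points at (hypothetical format)."""
    file: str
    rule: str


def changed_files(repo: str, vuln_rev: str, fixed_rev: str) -> set[str]:
    """Files touched between the vulnerable and the fixed revision (the security fix)."""
    out = subprocess.run(
        ["git", "-C", repo, "diff", "--name-only", vuln_rev, fixed_rev],
        check=True, capture_output=True, text=True,
    ).stdout
    return {line.strip() for line in out.splitlines() if line.strip()}


def split_findings(findings: set[Finding], fix_files: set[str]) -> tuple[set[Finding], set[Finding]]:
    """Partition tool output: warnings inside the fix are candidate true positives;
    the rest is treated as noise unrelated to the known vulnerability."""
    related = {f for f in findings if f.file in fix_files}
    return related, findings - related


# Example usage with placeholder revisions and findings:
# fix_files = changed_files("tomcat", "<vulnerable-rev>", "<fixed-rev>")
# related, unrelated = split_findings(tool_findings, fix_files)
```

Restricting the comparison to code touched by the fix is what filters out the "unrelated alarms" mentioned above; a finer-grained variant could match warnings at method or line level instead of per file.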

[30]
Outline: Abstract · 1 Research Problem & Motivation · 2 Background & Related Work · 3 Approach & Uniqueness · 4 Results & Contributions · References

Cited By

  • (2022) "On the adoption of static analysis for software security assessment – A case study of an open-source e-government project". Computers and Security 111(C). DOI: 10.1016/j.cose.2021.102470. Online publication date: 9 Apr 2022.
  • (2021) "On the Combination of Static Analysis for Software Security Assessment – A Case Study of an Open-Source e-Government Project". Advances in Science, Technology and Engineering Systems Journal 6(2), 921–932. DOI: 10.25046/aj060210. Online publication date: Apr 2021.


    Published In

    ESEC/FSE 2017: Proceedings of the 2017 11th Joint Meeting on Foundations of Software Engineering
    August 2017
    1073 pages
ISBN: 9781450351058
DOI: 10.1145/3106237

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Large-scale Benchmark
    2. Software Security
    3. Static Analysis
    4. Static Application Security Testing Tool
    5. Vulnerability

    Qualifiers

    • Short-paper

    Conference

    ESEC/FSE'17

    Acceptance Rates

    Overall acceptance rate: 112 of 543 submissions (21%)

