Abstract
Automated Static Analysis Tools (ASATs) are one of the best ways to search for vulnerabilities in applications, so they are widely used by developers to improve their applications. However, it is well known that the performance of such tools is limited, and their detection capabilities may not meet the requirements of the project given the criticality of the application. Diversity is an obvious direction for improving true positives, as different tools usually report distinct vulnerabilities; however, it comes at the cost of also increasing false positives, which may be unacceptable in some scenarios. In this paper, we study the problem of combining diverse ASATs to improve the overall detection of vulnerabilities in web applications, considering four development scenarios with different criticality goals and constraints. These scenarios range from low-budget to high-end (e.g., business-critical) web applications. We evaluated five ASATs on two datasets, one with real WordPress plugins and another with synthetic test cases. Our findings reveal that combining the outputs of several ASATs does not always improve vulnerability detection performance over a single ASAT. Using our procedure, a developer is able to choose the combination of ASATs that best fits the project requirements.
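As a minimal illustration of the combination idea (a sketch, not the paper's actual procedure), two extreme strategies for merging the findings of several ASATs are 1-out-of-N (union), which maximizes true positives but accumulates each tool's false positives, and N-out-of-N (intersection), which reports only findings every tool agrees on. The tool names and findings below are hypothetical:

```python
# Two extreme strategies for combining ASAT reports over the same codebase.
# A finding is modeled as a (file, line, vulnerability_type) tuple.

def combine_1oon(reports):
    """1-out-of-N: keep a finding if at least one tool reports it (union)."""
    return set().union(*reports.values())

def combine_noon(reports):
    """N-out-of-N: keep a finding only if every tool reports it (intersection)."""
    return set.intersection(*reports.values())

# Hypothetical outputs of two tools on a small PHP project.
reports = {
    "tool_a": {("login.php", 12, "XSS"), ("db.php", 40, "SQLi")},
    "tool_b": {("login.php", 12, "XSS")},
}

print(sorted(combine_1oon(reports)))  # union: both findings
print(sorted(combine_noon(reports)))  # intersection: only the agreed XSS finding
```

Intermediate M-out-of-N thresholds trade off between these two extremes, which is why the best combination depends on the project's tolerance for false positives.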
Additional information
This work extends a preliminary version presented at the 13th European Dependable Computing Conference (EDCC 2017).
Cite this article
Nunes, P., Medeiros, I., Fonseca, J. et al. An empirical study on combining diverse static analysis tools for web security vulnerabilities based on development scenarios. Computing 101, 161–185 (2019). https://doi.org/10.1007/s00607-018-0664-z