A Self-adaptive Approach for Assessing the Criticality of Security-Related Static Analysis Alerts

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12955)

Abstract

Despite the acknowledged ability of automated static analysis to detect software vulnerabilities, its adoption in practice is limited, mainly due to the large number of false alerts (i.e., false positives) that it generates. Although several machine learning-based techniques have been proposed for assessing the actionability of the produced alerts and for filtering out false positives, none has demonstrated sufficiently accurate results, and only limited attempts focus on assessing the criticality of the alerts from a security viewpoint. To this end, in the present paper we propose an approach for assessing the criticality of security-related static analysis alerts. In particular, we develop a machine learning-based technique for prioritizing and classifying such alerts based on their criticality, considering information retrieved from the alerts themselves, from vulnerability prediction models, and from user feedback. The concept of retraining is also adopted, enabling the model to correct itself and adapt to previously unknown software products. The technique has been evaluated through a case study, which revealed its capacity to effectively assess the criticality of alerts of previously unknown projects, as well as its ability to dynamically adapt to the characteristics of a new project and provide more accurate assessments through retraining.
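As a rough illustration of the self-adaptive loop the abstract describes, the sketch below trains a classifier on per-alert features, ranks the alerts of a previously unseen project by predicted criticality, and refits the model once user feedback arrives. The feature set (alert severity, alert category, vulnerability-prediction score of the enclosing file), the Random Forest model, and the synthetic data are assumptions made for illustration only; the paper's actual features, model family, and retraining policy are defined in the full text.

```python
# A minimal sketch of criticality assessment with retraining, assuming a
# hypothetical three-feature alert representation; not the paper's implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-alert features: static analysis severity, alert category,
# and the vulnerability-prediction score of the file containing the alert.
X_train = rng.random((500, 3))
# Synthetic ground truth: 1 = critical, 0 = non-critical.
y_train = (X_train @ np.array([0.5, 0.2, 0.8]) > 0.7).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def prioritize(alerts: np.ndarray) -> np.ndarray:
    """Rank the alerts of a (possibly unseen) project, most critical first."""
    scores = model.predict_proba(alerts)[:, 1]
    return np.argsort(scores)[::-1]

def retrain_with_feedback(alerts: np.ndarray, labels: np.ndarray) -> None:
    """Fold user-confirmed labels into the training set and refit, letting
    the model adapt to the characteristics of the new project."""
    global X_train, y_train
    X_train = np.vstack([X_train, alerts])
    y_train = np.concatenate([y_train, labels])
    model.fit(X_train, y_train)

# New project: rank its alerts, collect analyst feedback, retrain.
new_alerts = rng.random((20, 3))
ranking = prioritize(new_alerts)
feedback = (new_alerts[:, 2] > 0.6).astype(int)  # stand-in for user feedback
retrain_with_feedback(new_alerts, feedback)
```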


Notes

  1. https://www.bsimm.com/
  2. https://owasp.org/www-project-dependency-check/
  3. https://pmd.github.io/
  4. https://hub.docker.com/repository/registry-1.docker.io/iliakalo/evit-image/
  5. https://gitlab.com/iliaskalou/aait/-/wikis/Exploitable-Vulnerability-Identification-Wiki


Acknowledgements

This work is partially funded by the European Union's Horizon 2020 Research and Innovation Programme through the IoTAC project under Grant Agreement No. 952684.

Author information

Corresponding author

Correspondence to Miltiadis Siavvas.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Siavvas, M., Kalouptsoglou, I., Tsoukalas, D., Kehagias, D. (2021). A Self-adaptive Approach for Assessing the Criticality of Security-Related Static Analysis Alerts. In: Gervasi, O., et al. (eds.) Computational Science and Its Applications – ICCSA 2021. ICCSA 2021. Lecture Notes in Computer Science, vol. 12955. Springer, Cham. https://doi.org/10.1007/978-3-030-87007-2_21


  • DOI: https://doi.org/10.1007/978-3-030-87007-2_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-87006-5

  • Online ISBN: 978-3-030-87007-2

  • eBook Packages: Computer Science (R0)
