
An Approach to Static-Dynamic Software Analysis

  • Conference paper
Formal Techniques for Safety-Critical Systems (FTSCS 2015)

Abstract

Safety-critical software in industry is typically subjected to both dynamic testing and static program analysis. However, while testing is expensive to scale, static analysis is prone to false positives and false negatives. In this work we propose a combined solution: static analysis zooms in on potential bug candidates in large code bases, and symbolic execution confirms these bugs and creates concrete witnesses. The approach is intended to maintain scalability while improving precision, thereby remedying the shortcomings of each individual technique. Moreover, we developed the SEEKFAULT tool, which creates local symbolic execution targets from static analysis bug candidates, and we evaluate its effectiveness on the SV-COMP loop benchmarks. We show that a conservative tuning achieves a 98% detection rate on that benchmark while reducing false positive rates by around 50% compared to a purely static analysis approach.
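The two-stage pipeline the abstract describes can be sketched as follows. This is a purely illustrative sketch in Python rather than the paper's C tooling; every function name and data value below is hypothetical and does not reflect SEEKFAULT's actual interface.

```python
# Hypothetical sketch: a scalable but imprecise static analysis proposes
# bug candidates, and local symbolic execution either confirms each
# candidate with a concrete witness input or discards it as a false
# positive.

def static_analysis(source):
    """Return cheap bug candidates as (file, line, check) tuples."""
    # Placeholder: a real pipeline would invoke a static analyzer here.
    return [("lib.c", 42, "null-deref"), ("lib.c", 77, "overflow")]

def symbolic_confirm(source, candidate):
    """Symbolically execute a local slice around the candidate; return a
    concrete witness input if the bug is reachable, else None."""
    # Placeholder: a real pipeline would call a symbolic executor and an
    # SMT solver here to search for a feasible path to the candidate.
    _, line, _ = candidate
    return {"input": "x = 0"} if line == 42 else None

def seek_faults(source):
    """Report only candidates confirmed with a concrete witness."""
    return [(cand, wit)
            for cand in static_analysis(source)
            if (wit := symbolic_confirm(source, cand)) is not None]

print(seek_faults("lib.c"))
```

Under these assumptions, only the confirmed candidate is reported together with its witness, which is how such a combination can cut false positives relative to static analysis alone.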

Pablo gratefully thanks DATA61 and the Australian Government for their funding and support of him as a research intern and fellow student. The authors acknowledge funding from projects TEC2011-28666-C04-02 and TEC2014-58036-C4-3-R and grant BES-2012-055572, awarded by the Spanish Ministry of Economy and Competitiveness.

Funded by the Australian Government through the Department of Communications and the Australian Research Council through the ICT Centre of Excellence Program.


Notes

  1. http://sv-comp.sosy-lab.org/.


Author information

Correspondence to Pablo Gonzalez-de-Aledo.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Gonzalez-de-Aledo, P., Sanchez, P., Huuck, R. (2016). An Approach to Static-Dynamic Software Analysis. In: Artho, C., Ölveczky, P. (eds) Formal Techniques for Safety-Critical Systems. FTSCS 2015. Communications in Computer and Information Science, vol 596. Springer, Cham. https://doi.org/10.1007/978-3-319-29510-7_13

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-29510-7_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-29509-1

  • Online ISBN: 978-3-319-29510-7

  • eBook Packages: Computer Science, Computer Science (R0)
