Abstract
Over the last several years, tools for program analysis and verification have become much more mature. There are now a number of competitions that evaluate and compare the implemented analyses on a given set of benchmarks. The comparison of the analyses focuses either on the analysis results themselves (verification of specified properties) or on their impact on a client analysis. This track is concerned with methods of evaluation for comparing analysis and verification techniques, and with how verified program properties can be represented so that they remain reproducible and reusable as intermediate results in the overall verification process (i.e., by other verification tools or in subsequent verification steps).
Acknowledgments
This work was partially performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, Lawrence Livermore National Security, LLC. IM release number LLNL-CONF-757485.
© 2018 Springer Nature Switzerland AG
Schordan, M., Beyer, D., Siegel, S.F. (2018). Evaluating Tools for Software Verification (Track Introduction). In: Margaria, T., Steffen, B. (eds) Leveraging Applications of Formal Methods, Verification and Validation. Verification. ISoLA 2018. Lecture Notes in Computer Science(), vol 11245. Springer, Cham. https://doi.org/10.1007/978-3-030-03421-4_10