Reproduction Package for STTT Submission "Six Years Later: Testing vs. Model Checking"
Description
Paper Abstract: Five years ago, we performed the first large-scale comparison of automated test generators and software model checkers with respect to bug-finding capabilities on a benchmark set with 5,693 C programs. Since then, the International Competition on Software Testing (Test-Comp) has established standardized formats and community-agreed rules for the experimental comparison of test generators. With this new context, it is time to revisit our initial question: Model checkers or test generators—which tools are more effective in finding bugs in software? To answer this, we perform a comparative analysis on the tools and existing data published by two competitions, the International Competition on Software Verification (SV-COMP) and Test-Comp. The results provide two insights: (1) Almost all test generators that participate in Test-Comp use hybrid approaches that include formal methods, and (2) while model checkers are still highly competitive, the test generators’ bug-finding capabilities now outperform them.
Hardware Requirements
This artifact requires at least 16 GB of RAM, 4 CPU cores, and 50 GB of disk space.
Usage
Start the VM, open a terminal, and switch into the directory ~/Test-Study. The README.md in this directory contains all further instructions.
Note: The VM already has all requirements installed.
VM Username: study
VM Password: study
Files
2023-Test-Study-Artifact.zip (18.0 GB)
md5: d123beb2288cb0d17db718388517d17c
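After downloading the archive, its integrity can be checked against the MD5 checksum listed above. A minimal sketch, assuming the zip file sits in the current directory:

```shell
# Compute the archive's MD5 hash and compare it to the published checksum.
# Expected hash taken from the file listing above.
md5sum 2023-Test-Study-Artifact.zip
# The printed hash should match: d123beb2288cb0d17db718388517d17c
```

Alternatively, `md5sum -c` can verify the hash automatically if it is stored in a checksum file (format: `<hash>  <filename>`).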