DOI: 10.1145/2601248.2601288

A controlled experiment to evaluate the effectiveness and the efficiency of four static program analysis tools for Java programs

Published: 13 May 2014

ABSTRACT

This paper presents the results of an experimental study in which four open-source static program analysis (SPA) tools, namely FindBugs, CodePro Analytix, UCDetector, and PMD, were applied to four small Java projects to appraise their effectiveness and efficiency in detecting mutation bugs. In this experiment, we generated multiple applicable mutants of each project using the MuJava tool and subsequently applied the four SPA tools to measure their bug-detection effectiveness and efficiency. The obtained data were analyzed under two different bug classifications, mutant-based and severity-based. Our results showed that PMD demonstrated the highest bug-detection effectiveness and efficiency among the four SPA tools, whereas CodePro Analytix identified bugs across all severity categories, including most of the high-severity bugs.
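
To make the methodology concrete, here is a minimal sketch of the kind of seeded fault that MuJava generates and against which the SPA tools were scored. The class, method names, and values are hypothetical (not drawn from the paper's four subject projects); the operator illustrated corresponds to MuJava's binary arithmetic-operator-replacement (AORB) family.

    // Hypothetical illustration of a MuJava-style mutant. The class and
    // values are invented for this example, not taken from the paper's
    // Bus/Car/Hotel/Monopoly subject projects.
    public class FareExample {

        // Original method under analysis.
        static int totalFare(int baseFare, int passengers) {
            return baseFare * passengers;
        }

        // Mutant produced by a binary arithmetic-operator-replacement
        // operator (MuJava's AORB family): '*' is replaced with '+'.
        // An SPA tool counts as detecting this mutant only if one of its
        // warnings distinguishes the mutated code from the original.
        static int totalFare_mutant(int baseFare, int passengers) {
            return baseFare + passengers;
        }

        public static void main(String[] args) {
            System.out.println(totalFare(10, 3));        // 30
            System.out.println(totalFare_mutant(10, 3)); // 13 -- the seeded bug
        }
    }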

References

  1. FindBugs. http://findbugs.sourceforge.net/. Accessed: 2013-12-20.
  2. MuJava. http://cs.gmu.edu/~offutt/mujava/. Accessed: 2013-12-20.
  3. Bus. http://1000projects.org/bus-scheduling-and-booking-system-cse-java-project-withcode.html.
  4. Car. http://projectabstracts.com/1798/simple-car-sales-system-in-java.html.
  5. CodePro. http://wiki.eclipse.org/imgaes/7/75/CodeProDatasheet.pdf/.
  6. Hotel. http://sourceforge.net/p/hotelmgmtsys/code/HEAD/tree/.
  7. Y.-S. Ma, J. Offutt, and Y.-R. Kwon. MuJava: A mutation system for Java. In Proceedings of the 28th International Conference on Software Engineering, ICSE '06, pages 827--830, New York, NY, USA, 2006. ACM.
  8. N. Meng, Q. Wang, Q. Wu, and H. Mei. An approach to merge results of multiple static analysis tools (short paper). In Quality Software, 2008. QSIC '08. The Eighth International Conference on, pages 169--174. IEEE, 2008.
  9. Monopoly. https://code.google.com/p/cosc603rajendranmonopoly/source/browse/trunk/+cosc603rajendranmonopoly/Monopoly/?r=9.
  10. PMD. http://pmd.sourceforge.net/.
  11. N. Rutar, C. B. Almazan, and J. S. Foster. A comparison of bug finding tools for Java. In Software Reliability Engineering, 2004. ISSRE 2004. 15th International Symposium on, pages 245--256. IEEE, 2004.
  12. UCDetector. http://www.ucdetector.org/.
  13. S. Wagner, J. Jürjens, C. Koller, and P. Trischberger. Comparing bug finding tools with reviews and tests. In Proceedings of the 17th IFIP TC6/WG 6.1 International Conference on Testing of Communicating Systems, TestCom '05, pages 40--55, Berlin, Heidelberg, 2005. Springer-Verlag.
  14. M. S. Ware and C. J. Fox. Securing Java code: heuristics and an evaluation of static analysis tools. In Proceedings of the 2008 Workshop on Static Analysis, pages 12--21. ACM, 2008.

Published in

  EASE '14: Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering
  May 2014
  486 pages
  ISBN: 9781450324762
  DOI: 10.1145/2601248

              Copyright © 2014 ACM


              Publisher

              Association for Computing Machinery

              New York, NY, United States



              Qualifiers

              • research-article

              Acceptance Rates

Overall acceptance rate: 71 of 232 submissions, 31%
