An externally replicated experiment to evaluate software testing methods

Published: 14 April 2013

Abstract

Context: Many empirical studies have been carried out over the past decades to evaluate software testing methods. However, we are still unable to generalize the results, as most studies are incomplete and differ significantly from one another. Objective: To contribute to the existing knowledge base on testing techniques by evaluating three software testing methods using a well-defined, standard schema: a) code reading by stepwise abstraction, b) functional testing using equivalence partitioning and boundary value analysis, and c) structural testing using 100% branch, multiple-condition, loop, and relational-operator coverage. Method: A controlled experiment was carried out with eighteen subjects who applied the three techniques to three C programs in a fractional factorial experimental design. Results: There is no difference among the techniques in terms of failure-observation and fault-finding effectiveness; effectiveness depends on the program rather than on the technique. In terms of failure-observation efficiency, all techniques performed equally; however, in terms of fault-isolation efficiency, code reading performed better than structural testing, which in turn outperformed functional testing. With respect to fault types, all techniques performed equally except for cosmetic faults, where functional testing performed better than the other two methods. Conclusion: We conclude that all techniques are equivalent in terms of effectiveness, although they differ partially in terms of efficiency. The effect of the program was significant in almost all cases. We need to build standardized, better laboratory packages that represent actual software engineering practices; carrying out experiments on such packages will help derive realistic results.


Cited By

  • (2020) Replication of Studies in Empirical Software Engineering: A Systematic Mapping Study, From 2013 to 2018. IEEE Access, 8, 26773--26791. DOI: 10.1109/ACCESS.2019.2952191
  • (2018) Integrated Test Development. Proceedings of the 7th Computer Science Education Research Conference, 9--20. DOI: 10.1145/3289406.3289408
  • (2016) Impact of CS programs on the quality of test cases generation. Proceedings of the 38th International Conference on Software Engineering Companion, 374--383. DOI: 10.1145/2889160.2889190


Published In

EASE '13: Proceedings of the 17th International Conference on Evaluation and Assessment in Software Engineering
April 2013
268 pages
ISBN:9781450318488
DOI:10.1145/2460999

Sponsors

  • Centro de Informatica - UFPE
  • SBC: Brazilian Computer Society
  • CNPq: Conselho Nacional de Desenvolvimento Cientifico e Tecnologico
  • CAPES: Brazilian Higher Education Funding Council

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. comparison of testing methods
  2. evaluation
  3. experiment
  4. replication

Qualifiers

  • Research-article


Acceptance Rates

EASE '13 Paper Acceptance Rate: 31 of 94 submissions (33%)
Overall Acceptance Rate: 71 of 232 submissions (31%)


