DOI: 10.1145/2460999.2461011

A conceptual model to address threats to validity in controlled experiments

Published: 14 April 2013

Abstract

Context: During the planning phase of an experiment, the threats to validity must be identified so that their impact on the data can be assessed and, where possible, actions to address them can be defined. Objective: This paper proposes a conceptual model that captures the relationships between threats to validity and the actions that address them. Method: A Systematic Literature Review was conducted to collect the data for building the conceptual model; it identified 166 papers, published in nine journals and four conferences, that report threats to validity in controlled experiments. Results: We identified 39 threats to internal validity (with 95 actions to address them), 9 threats to external validity (23 actions), 10 threats to construct validity (21 actions), and 8 threats to conclusion validity (19 actions). Conclusion: By presenting this conceptual model, we intend to assist novice researchers in identifying and addressing the threats to validity of empirical studies.
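
The model described above is essentially a catalogue that links each threat to validity to the actions reported to mitigate it. As a rough, hypothetical illustration only (not taken from the paper), the sketch below shows one way such a threat-to-action mapping could be represented in Python; the validity categories and counts come from the abstract, while the example threat names and mitigation actions are invented placeholders.

# Illustrative sketch only (not from the paper): one possible in-code
# representation of the threat -> mitigation-action mapping described in the
# abstract. Category names and counts come from the abstract; the example
# threat names and actions below are hypothetical placeholders.
from dataclasses import dataclass, field


@dataclass
class Threat:
    name: str                                  # e.g. "maturation" (hypothetical)
    category: str                              # "internal", "external", "construct", "conclusion"
    actions: list[str] = field(default_factory=list)  # reported mitigation actions


# Sizes reported in the abstract: {category: (number of threats, number of actions)}
REPORTED_COUNTS = {
    "internal": (39, 95),
    "external": (9, 23),
    "construct": (10, 21),
    "conclusion": (8, 19),
}


def actions_for(catalogue: list[Threat], category: str) -> list[str]:
    """Collect every mitigation action recorded for one validity category."""
    return [a for t in catalogue if t.category == category for a in t.actions]


# Hypothetical usage: two placeholder entries, not the paper's actual catalogue.
catalogue = [
    Threat("maturation", "internal", ["keep experimental sessions short"]),
    Threat("subject representativeness", "external",
           ["recruit practitioners as well as students"]),
]
print(actions_for(catalogue, "internal"))  # -> ['keep experimental sessions short']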


      Published In

EASE '13: Proceedings of the 17th International Conference on Evaluation and Assessment in Software Engineering
April 2013, 268 pages
ISBN: 9781450318488
DOI: 10.1145/2460999

      Sponsors

• Centro de Informática - UFPE
• SBC: Brazilian Computer Society
• CNPq: Conselho Nacional de Desenvolvimento Científico e Tecnológico
• CAPES: Brazilian Higher Education Funding Council

      Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. controlled experiment
      2. empirical study
      3. systematic literature review
      4. threats to validity

      Qualifiers

      • Research-article

      Acceptance Rates

EASE '13 paper acceptance rate: 31 of 94 submissions (33%)
Overall acceptance rate: 71 of 232 submissions (31%)

      Cited By

• (2024) Anomaly Detection in Railway Sensor Data Environments: State-of-the-Art Methods and Empirical Performance Evaluation, Sensors 24(8):2633. DOI: 10.3390/s24082633. Online publication date: 20-Apr-2024.
• (2024) Threats to Validity in Software Engineering – hypocritical paper section or essential analysis?, Proceedings of the 18th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, pp. 314-324. DOI: 10.1145/3674805.3686691. Online publication date: 24-Oct-2024.
• (2023) How Do Computing Education Researchers Talk About Threats and Limitations?, Proceedings of the 2023 ACM Conference on International Computing Education Research - Volume 1, pp. 381-396. DOI: 10.1145/3568813.3600114. Online publication date: 7-Aug-2023.
• (2023) Construct Validity in Software Engineering, IEEE Transactions on Software Engineering 49(3), pp. 1374-1396. DOI: 10.1109/TSE.2022.3176725. Online publication date: 1-Mar-2023.
• (2022) Tisane: Authoring Statistical Models via Formal Reasoning from Conceptual and Data Relationships, Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3491102.3501888. Online publication date: 29-Apr-2022.
• (2018) Threats to validity in controlled experiments in software engineering, Proceedings of the XXXII Brazilian Symposium on Software Engineering, pp. 52-61. DOI: 10.1145/3266237.3266264. Online publication date: 17-Sep-2018.
• (2018) Improving the Quality of Controlled Experiments in Software Engineering, ACM SIGSOFT Software Engineering Notes 43(1), pp. 1-6. DOI: 10.1145/3178315.3178321. Online publication date: 28-Mar-2018.
• (2018) Threats to validity in search-based predictive modelling for software engineering, IET Software 12(4), pp. 293-305. DOI: 10.1049/iet-sen.2018.5143. Online publication date: Aug-2018.
• (2018) Empirical study on software process variability modelling with SMartySPEM and vSPEM, IET Software 12(6), pp. 536-546. DOI: 10.1049/iet-sen.2017.0061. Online publication date: Dec-2018.
• (2016) Common threats to software quality predictive modeling studies using search-based techniques, 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI), pp. 554-560. DOI: 10.1109/ICACCI.2016.7732104. Online publication date: Sep-2016.
