Tangible Assets to Improve Research Quality: A Meta Analysis Case Study

  • Conference paper
High Performance Computing Systems (WSCAD 2018)

Abstract

This paper presents a meta-analysis of the publications from all 18 previous editions of WSCAD, with the aim of understanding how performance results are validated and reported. The meta-analysis extracts terms (keywords) from these papers belonging to three categories: statistics, metrics, and tests. Of the 426 papers analyzed, 93% refer to at least one of the terms considered, indicating a concern that results must be properly reported for a paper to be considered relevant to this conference. Nevertheless, the analysis shows that only 3% of the papers apply reliable statistical tests to validate their results. This paper describes the meta-analysis performed and proposes a direction for adopting a reporting guideline to improve how results are reported in this conference and in others on related subjects.
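The abstract describes the method only at a high level: extract category terms from each paper and count how many papers mention at least one. The following is a minimal sketch of that kind of term counting; the keyword lists in CATEGORIES and the sample corpus are hypothetical illustrations, not the authors' actual protocol or term lists.

```python
import re

# Hypothetical keyword lists for the three categories named in the
# abstract (statistics, metrics, tests); the paper's real lists may differ.
CATEGORIES = {
    "statistics": ["mean", "median", "standard deviation", "confidence interval"],
    "metrics": ["speedup", "throughput", "latency", "efficiency"],
    "tests": ["t-test", "anova", "wilcoxon", "kruskal-wallis"],
}


def categories_mentioned(text: str) -> set:
    """Return the set of categories whose terms appear at least once in the text."""
    lowered = text.lower()
    return {
        cat
        for cat, terms in CATEGORIES.items()
        if any(re.search(r"\b" + re.escape(t) + r"\b", lowered) for t in terms)
    }


def summarize(papers: list) -> None:
    """Print the share of papers mentioning at least one term (cf. the 93% figure)."""
    hits = sum(1 for p in papers if categories_mentioned(p))
    print(f"{hits}/{len(papers)} papers ({100 * hits / len(papers):.0f}%) "
          "mention at least one term")


if __name__ == "__main__":
    # Toy corpus standing in for the full text of the analyzed papers.
    corpus = [
        "We report the mean speedup and validate it with a t-test.",
        "The system achieves high throughput on all workloads.",
        "A qualitative discussion of the proposed architecture.",
    ]
    summarize(corpus)
```

In practice such an analysis would run over the extracted full text of all 426 papers (the authors used a qualitative-analysis tool, per note 1) rather than an in-memory list of strings.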

This paper was produced with the support of the National Program of Academic Cooperation of CAPES/Brasil.


Notes

  1. http://www.software.com.br/p/qsr-nvivo.

  2. http://www.equator-network.org/library/.

  3. Preferred Reporting Items for Systematic Reviews and Meta-Analyses.

  4. Strengthening the Reporting of Observational Studies in Epidemiology.

  5. Consolidated Standards of Reporting Trials.

  6. Transparency and Openness Promotion.

  7. http://link.springer.com.

  8. http://ieeexplore.ieee.org.

  9. http://portal.acm.org.

  10. http://www.sciencedirect.com.


Author information

Correspondence to Alessander Osorio.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Osorio, A., Dias, M., Cavalheiro, G.G.H. (2020). Tangible Assets to Improve Research Quality: A Meta Analysis Case Study. In: Bianchini, C., Osthoff, C., Souza, P., Ferreira, R. (eds) High Performance Computing Systems. WSCAD 2018. Communications in Computer and Information Science, vol 1171. Springer, Cham. https://doi.org/10.1007/978-3-030-41050-6_8

  • DOI: https://doi.org/10.1007/978-3-030-41050-6_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41049-0

  • Online ISBN: 978-3-030-41050-6

  • eBook Packages: Computer Science (R0)
