DOI: 10.1145/2897676.2897678

Experiences of testing bioinformatics programs for detecting subtle faults

Published: 14 May 2016

ABSTRACT

One of the biggest challenges in conducting automated, systematic testing of scientific programs is the oracle problem. The challenge is especially prevalent in bioinformatics because of the inherent complexity of its programs. In this paper, we explore two approaches to automated, systematic testing of bioinformatics programs: pseudo-oracles and metamorphic testing. We use BBMap, an open-source genome alignment tool, as the system under test to evaluate how effectively each approach identifies subtle faults.
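As a rough illustration of the pseudo-oracle approach (a sketch, not the paper's actual harness; the command strings and file names below are placeholders), the system under test and an independent implementation are run on the same reads, and the mapping position each one reports per read is compared:

    import subprocess

    def mapped_positions(cmd, sam_path):
        """Run an aligner (placeholder command string) and collect, for each
        read name, the reference name and position reported in its SAM output."""
        subprocess.run(cmd, shell=True, check=True)
        positions = {}
        with open(sam_path) as sam:
            for line in sam:
                if line.startswith("@"):        # skip SAM header lines
                    continue
                fields = line.split("\t")
                # SAM columns: QNAME (index 0), RNAME (index 2), POS (index 3)
                positions[fields[0]] = (fields[2], fields[3])
        return positions

    # Placeholder invocations; real BBMap and pseudo-oracle command lines differ.
    sut    = mapped_positions("bbmap.sh in=reads.fq ref=ref.fa out=sut.sam", "sut.sam")
    oracle = mapped_positions("other_aligner reads.fq ref.fa oracle.sam", "oracle.sam")

    agree = sum(1 for q in sut if q in oracle and sut[q] == oracle[q])
    print(f"agreement: {agree}/{len(sut)} reads")

A fault in the system under test should show up as a drop in the agreement rate; as the next paragraph notes, that rate must itself be stable for the comparison to serve as an oracle.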

Our results show that pseudo-oracle testing does not provide a consistent basis for fault detection, mainly because the rate of agreement between the outputs of the system under test and the pseudo-oracle is itself inconsistent. Metamorphic testing, by contrast, detects the majority of the subtle faults, suggesting that it is the more effective approach for identifying faults of this kind.
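To make the metamorphic idea concrete, one plausible relation for a read aligner (a minimal sketch under assumptions; map_read is a hypothetical wrapper around the system under test, not a BBMap API) is that aligning the reverse complement of a read should report the same reference position with the strand flipped:

    COMPLEMENT = str.maketrans("ACGT", "TGCA")

    def reverse_complement(seq):
        """Return the reverse complement of a DNA sequence."""
        return seq.translate(COMPLEMENT)[::-1]

    def check_revcomp_relation(read, ref, map_read):
        """Metamorphic relation: the reverse-complemented read should map to
        the same position on the opposite strand. map_read is assumed to
        return (position, strand) as reported by the aligner under test."""
        pos, strand = map_read(read, ref)
        rc_pos, rc_strand = map_read(reverse_complement(read), ref)
        return rc_pos == pos and rc_strand != strand

A violated relation signals a fault without requiring the true alignment to be known, which is precisely what makes the technique applicable when no oracle exists.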


• Published in

    SE4Science '16: Proceedings of the International Workshop on Software Engineering for Science
    May 2016, 41 pages
    ISBN: 9781450341677
    DOI: 10.1145/2897676
    Copyright © 2016 ACM


    Publisher

    Association for Computing Machinery

    New York, NY, United States


