
MoonBox: debugging with online slicing and dryrun

Published: 23 July 2012
DOI: 10.1145/2349896.2349908

ABSTRACT

Efficient tools are indispensable in the battle against software bugs. In this short paper, we introduce two techniques that target different phases of an interactive and iterative debugging session. To make slice-assisted log analysis practical for fault diagnosis, slicing itself must be nearly instantaneous. We therefore split the costly slicing computation into an online phase and an offline phase, and employ incremental updates after program edits. The result is a vast reduction in slicing cost: for the benchmarks we tested, slices can be computed within seconds, at 0.02% to 6.5% of the cost of the unmodified slicing algorithm.
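
To make the online/offline split concrete, the sketch below shows one way (not necessarily MoonBox's implementation) that dependence recording can be kept cheap while the program runs, with the expensive backward traversal deferred until a slice is actually requested, and with incremental invalidation after an edit. The class, method names, and statement ids are hypothetical.

```python
from collections import defaultdict

class DependenceGraph:
    """Toy dynamic dependence graph split into an online recording phase
    and an offline, on-demand slicing phase (illustrative only)."""

    def __init__(self):
        # statement id -> ids of statements it depends on
        self.deps = defaultdict(set)

    # Online phase: cheap bookkeeping while the instrumented program runs.
    def record_dep(self, stmt, depends_on):
        self.deps[stmt].add(depends_on)

    # Offline phase: backward traversal from the slicing criterion,
    # performed only when the developer asks for a slice.
    def backward_slice(self, criterion):
        result, worklist = set(), [criterion]
        while worklist:
            s = worklist.pop()
            if s in result:
                continue
            result.add(s)
            worklist.extend(self.deps.get(s, ()))
        return result

    # Incremental update after a program edit: drop only the edges that
    # touch edited statements instead of recomputing everything.
    def invalidate(self, edited_stmts):
        for s in edited_stmts:
            self.deps.pop(s, None)
        for preds in self.deps.values():
            preds.difference_update(edited_stmts)

# Usage sketch with hypothetical statement ids.
g = DependenceGraph()
g.record_dep("s3", "s1")   # s3 reads a value written at s1
g.record_dep("s3", "s2")
g.record_dep("s4", "s3")
print(g.backward_slice("s4"))  # {'s4', 's3', 's2', 's1'}
g.invalidate({"s2"})           # an edit touches s2; only its edges are dropped
print(g.backward_slice("s4"))  # {'s4', 's3', 's1'}
```

The point of the split is that only the recording has to happen at run time; the traversal cost is paid per query, which is what makes second-scale slice turnaround plausible.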

Running slicing in situ with instant response time opens up the possibility of editing-time validation, which we call dryrun. The idea is that a pair of slices, one forward from the root cause and one backward from the bug site, defines the scope within which to validate a fix. This localization makes it possible to invoke symbolic execution and constraint solving, which would otherwise be too expensive to use in an interactive debugging environment.
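
As a rough sketch of the scoping idea, and assuming nothing about MoonBox's actual interface, the two slices can be intersected to obtain the set of statements a fix validation needs to consider; the hand-off to a symbolic-execution engine is indicated only by a comment, since no particular engine is implied here. All function names and statement ids are hypothetical.

```python
from collections import defaultdict

def forward_slice(succs, root_cause):
    """Statements transitively affected by the suspected root cause."""
    seen, work = set(), [root_cause]
    while work:
        s = work.pop()
        if s in seen:
            continue
        seen.add(s)
        work.extend(succs.get(s, ()))
    return seen

def backward_slice(preds, bug_site):
    """Statements the bug site transitively depends on."""
    seen, work = set(), [bug_site]
    while work:
        s = work.pop()
        if s in seen:
            continue
        seen.add(s)
        work.extend(preds.get(s, ()))
    return seen

def dryrun_scope(succs, preds, root_cause, bug_site):
    """Validation scope: statements on some dependence path from the
    root cause to the bug site (intersection of the two slices)."""
    return forward_slice(succs, root_cause) & backward_slice(preds, bug_site)

# Hypothetical dependence edges: succs[a] lists statements that depend on a.
succs = {"s1": ["s2", "s5"], "s2": ["s3"], "s3": ["s4"], "s5": ["s6"]}
preds = defaultdict(list)
for a, bs in succs.items():
    for b in bs:
        preds[b].append(a)

scope = dryrun_scope(succs, preds, root_cause="s1", bug_site="s4")
print(sorted(scope))  # ['s1', 's2', 's3', 's4']; s5 and s6 fall outside
# Only the statements in `scope` would be handed to symbolic execution and
# constraint solving, which is what keeps the check cheap at editing time.
```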


Published in

APSYS '12: Proceedings of the Asia-Pacific Workshop on Systems
July 2012, 101 pages
ISBN: 9781450316699
DOI: 10.1145/2349896
Copyright © 2012 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 149 of 386 submissions, 39%
