research-article
Public Access

DiaPro: Unifying Dynamic Impact Analyses for Improved and Variable Cost-Effectiveness

Published: 06 April 2016

Abstract

Impact analysis not only assists developers with change planning and management, but also facilitates a range of other client analyses, such as testing and debugging. In particular, for developers working in the context of specific program executions, dynamic impact analysis is usually more desirable than static approaches, as it produces more manageable and relevant results with respect to those concrete executions. However, existing techniques for this analysis mostly lie at two extremes: either fast but too imprecise, or more precise yet overly expensive. In practice, both more cost-effective techniques and variable cost-effectiveness trade-offs are in demand to fit the variety of usage scenarios and budgets of impact analysis.

This article aims to fill the gap between these two extremes with an array of cost-effective analyses and, more broadly, to explore the cost and effectiveness dimensions in the design space of impact analysis. We present the development and evaluation of DiaPro, a framework that unifies a series of impact analyses, including three new hybrid techniques that combine static and dynamic analyses. Harnessing both static dependencies and multiple forms of dynamic data, including method-execution events, statement coverage, and dynamic points-to sets, DiaPro prunes false-positive impacts with varying strength, yielding varying effectiveness and overheads. The framework also facilitates an in-depth examination of the effects of various forms of program information on the cost-effectiveness of impact analysis.

We applied DiaPro to ten Java applications of diverse scales and domains, evaluating it thoroughly on both arbitrary and repository-based queries from those applications. We show that the three new analyses are all significantly more effective than existing alternatives while remaining efficient, and that the DiaPro framework, as a whole, provides flexible cost-effectiveness choices for impact analysis, with the best options for variable needs and budgets. Our study results also suggest that hybrid techniques tend to be much more cost-effective than purely dynamic approaches in general, that statement coverage mostly has stronger effects than dynamic points-to sets on the cost-effectiveness of dynamic impact analysis, and that static dependencies have even stronger effects than both forms of dynamic data.



• Published in

  ACM Transactions on Software Engineering and Methodology, Volume 25, Issue 2
  May 2016
  328 pages
  ISSN: 1049-331X
  EISSN: 1557-7392
  DOI: 10.1145/2913009

Copyright © 2016 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 June 2015
• Revised: 1 February 2016
• Accepted: 1 February 2016
• Published: 6 April 2016

        Qualifiers

        • research-article
        • Research
        • Refereed
