DOI: 10.1145/2362536.2362563
Research article

Using regression testing to analyze the impact of changes to variability models on products

Published: 02 September 2012

ABSTRACT

Industrial product lines are typically maintained for a long time and evolve continuously to address changing requirements and new technologies. Already derived products often have to be re-derived after such changes to benefit from new and updated features. Product line engineers thus frequently need to analyze the impact of changes to variability models to prevent unexpected changes in re-derived products. In this paper we present a tool-supported approach that informs engineers about the impacts of variability model changes on existing products. Regression tests are used to determine whether existing product configurations and generated product outputs can be re-derived without unexpected effects. We evaluate the feasibility of the approach based on changes observed in a real-world software product line. More specifically, we show how our approach helps engineers performing specific evolution tasks analyze the change impacts on existing products. We also evaluate the performance and scalability of our approach. Our results show that variability change impact analyses can be automated using model regression testing and can help reduce the gap between domain engineering and application engineering.
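
To make the regression-check idea concrete, the following is a minimal sketch in Python: re-derive an existing product from its recorded configuration decisions against the evolved variability model and compare the result with the previously derived baseline. The function names, the dictionary-based configuration format, and the derive_product callback are illustrative assumptions, not the authors' actual DOPLER-based tooling.

    from typing import Callable, Dict, Optional, Tuple

    # Hypothetical, simplified representation: a product configuration maps a
    # decision/option name to the value selected during derivation.
    Configuration = Dict[str, str]
    Diff = Dict[str, Tuple[Optional[str], Optional[str]]]  # name -> (baseline value, re-derived value)

    def regression_check(baseline: Configuration,
                         decisions: Configuration,
                         derive_product: Callable[[Configuration], Configuration]) -> Diff:
        """Re-derive a product from its recorded decisions and report every
        difference to the previously derived baseline configuration."""
        rederived = derive_product(decisions)
        impacted: Diff = {}
        for name in baseline.keys() | rederived.keys():
            old, new = baseline.get(name), rederived.get(name)
            if old != new:
                impacted[name] = (old, new)  # option added, removed, or changed
        return impacted  # empty dict: the product re-derives without unexpected effects

    # Toy usage: the evolved variability model now forces logging on, so the
    # regression check flags the difference for the engineer to review.
    if __name__ == "__main__":
        derive = lambda d: {**d, "logging": "enabled"}  # stand-in for real derivation
        old_product = {"gui": "full", "logging": "disabled"}
        print(regression_check(old_product, dict(old_product), derive))
        # {'logging': ('disabled', 'enabled')}

In the setting described in the abstract, the comparison would cover not only configuration values but also generated product outputs.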

        • Published in

          cover image ACM Other conferences
          SPLC '12: Proceedings of the 16th International Software Product Line Conference - Volume 1
          September 2012
          310 pages
          ISBN:9781450310949
          DOI:10.1145/2362536

          Copyright © 2012 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

          Acceptance Rates

SPLC '12 Paper Acceptance Rate: 22 of 66 submissions, 33%
Overall Acceptance Rate: 167 of 463 submissions, 36%
