DOI: 10.1145/3422392.3422443 · SBES Conference Proceedings · research-article

Revealing the Social Aspects of Design Decay: A Retrospective Study of Pull Requests

Published: 21 December 2020

ABSTRACT

The pull-based development model is widely used in source-code environments like GitHub. In this model, developers actively communicate and share their knowledge or opinions through the exchange of comments. Their goal is to improve the change under development, including its positive impact on the design structure. In this context, two central social aspects may contribute to combating or amplifying design decay. First, design decay may be avoided, reduced, or accelerated depending on whether the communication dynamics among developers, who play specific roles, are fluent and consistent throughout a change. Second, the discussion content itself may be decisive in either improving or deteriorating the structural design of a system. Unfortunately, there is no study on the role that key social aspects play in avoiding or amplifying design decay. Previous work either investigates technical aspects of design decay or confirms the high frequency of design discussions in pull-based software development. This paper reports a retrospective study aimed at understanding the role of communication dynamics and discussion content in design decay. We focused our analysis on 11 social metrics related to these two aspects, as well as 4 control technical metrics typically used as indicators of design decay. We analyzed more than 11k pull request discussions mined from five large open-source software systems. Our findings reveal that, first, many social metrics can be used to discriminate between design impactful and unimpactful pull requests. Second, various factors of communication dynamics are related to design decay; however, temporal factors of communication dynamics outperformed participant-role factors as indicators of design decay. Finally, we noticed that certain social metrics tend to be indicators of design decay when both aspects are analyzed together.
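The abstract describes mining social metrics from pull request discussions. As an illustration only, the following is a minimal Python sketch of how simple communication-dynamics metrics (comment count, participant count, discussion time span) might be computed from a pull request's comment data. The function name, input shape, and metric definitions here are hypothetical assumptions for the sketch, not the paper's actual metric suite.

```python
from datetime import datetime

def communication_metrics(comments):
    """Compute toy communication-dynamics metrics for one pull request.

    `comments` is a list of (author, iso_timestamp) pairs, e.g. as could be
    assembled from a repository-mining step (hypothetical input format).
    """
    authors = {author for author, _ in comments}
    times = sorted(datetime.fromisoformat(ts) for _, ts in comments)
    # Elapsed time between the first and last comment, in hours.
    span_hours = (times[-1] - times[0]).total_seconds() / 3600 if len(times) > 1 else 0.0
    return {
        "n_comments": len(comments),
        "n_participants": len(authors),
        "discussion_span_hours": span_hours,
    }

# Example discussion: two participants, three comments over 24 hours.
pr = [
    ("alice", "2020-01-10T09:00:00"),
    ("bob", "2020-01-10T15:00:00"),
    ("alice", "2020-01-11T09:00:00"),
]
metrics = communication_metrics(pr)
```

In a study like the one described, per-pull-request metrics of this kind would then be compared statistically between design-impactful and unimpactful pull requests.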


Published in

SBES '20: Proceedings of the XXXIV Brazilian Symposium on Software Engineering
October 2020, 901 pages
ISBN: 9781450387538
DOI: 10.1145/3422392
Copyright © 2020 ACM
Publisher: Association for Computing Machinery, New York, NY, United States

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Qualifiers

• research-article
• Research
• Refereed limited

Acceptance Rates

Overall acceptance rate: 147 of 427 submissions, 34%
