
Assessing the User-Perceived Quality of Source Code Components Using Static Analysis Metrics

Conference paper
Software Technologies (ICSOFT 2017)

Abstract

Nowadays, developers tend to adopt a component-based software engineering approach, reusing their own implementations and/or resorting to third-party source code. While this practice is in principle cost-effective, it may also lead to low-quality software products if the reused components are themselves of low quality. Thus, several approaches have been developed to measure the quality of software components. Most of them, however, rely on the aid of experts for defining target quality scores and deriving metric thresholds, leading to results that are context-dependent and subjective. In this work, we build a mechanism that employs static analysis metrics extracted from GitHub projects and defines a target quality score based on repositories’ stars and forks, which indicate their adoption/acceptance by developers. Upon removing outliers with a one-class classifier, we employ Principal Feature Analysis and examine the semantics among metrics to provide an analysis on five axes for source code components (classes or packages): complexity, coupling, size, degree of inheritance, and quality of documentation. Neural networks are then applied to estimate the final quality score given metrics from these axes. Preliminary evaluation indicates that our approach effectively estimates software quality at both class and package levels.
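The pipeline described in the abstract (static analysis metrics → outlier removal with a one-class classifier → feature analysis → neural network regression against a star/fork-based quality score) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dataset, metric definitions, PFA variant, and network architecture are assumptions here, and scikit-learn's PCA-loading ranking stands in for Principal Feature Analysis, on synthetic data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical static analysis metrics for 500 source code components,
# one column per axis: complexity, coupling, size, inheritance, documentation.
X = rng.normal(size=(500, 5))
# Hypothetical target quality score derived from stars/forks; the paper's
# actual scoring formula is not reproduced here.
y = 1.0 / (1.0 + np.exp(X @ np.array([-0.5, -0.3, -0.2, -0.1, 0.8])))

X = StandardScaler().fit_transform(X)

# Step 1: remove outlier components with a one-class classifier.
mask = OneClassSVM(nu=0.05).fit(X).predict(X) == 1
X_in, y_in = X[mask], y[mask]

# Step 2: feature analysis (simplified): rank metrics by their absolute
# loadings on the leading principal components and keep the top four.
pca = PCA(n_components=3).fit(X_in)
importance = np.abs(pca.components_).sum(axis=0)
selected = np.argsort(importance)[-4:]

# Step 3: a neural network maps the selected metrics to the quality score.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_in[:, selected], y_in)
print(f"kept {mask.sum()} of {len(X)} components, "
      f"train R^2 = {model.score(X_in[:, selected], y_in):.2f}")
```

With `nu=0.05`, roughly 5% of components are discarded as outliers before training, mirroring the abstract's motivation that low-quality or atypical repositories would otherwise distort the learned score.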



Author information

Correspondence to Themistoklis Diamantopoulos.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Dimaridou, V., Kyprianidis, A.C., Papamichail, M., Diamantopoulos, T., Symeonidis, A. (2018). Assessing the User-Perceived Quality of Source Code Components Using Static Analysis Metrics. In: Cabello, E., Cardoso, J., Maciaszek, L., van Sinderen, M. (eds) Software Technologies. ICSOFT 2017. Communications in Computer and Information Science, vol 868. Springer, Cham. https://doi.org/10.1007/978-3-319-93641-3_1


  • DOI: https://doi.org/10.1007/978-3-319-93641-3_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-93640-6

  • Online ISBN: 978-3-319-93641-3

  • eBook Packages: Computer Science, Computer Science (R0)
