
DrPython–WEB: A Tool to Help Teaching Well-Written Python Programs

Conference paper, in: Software Engineering and Formal Methods. SEFM 2021 Collocated Workshops (SEFM 2021)

Abstract

A good percentage of students, while learning to program for the first time in a higher education course, write inelegant code, i.e., code that is difficult to read, badly organized, and uncommented. Writing inelegant code reduces the student's professional opportunities, and it indicates a non-systematic programming style that makes the code very difficult to maintain (or even understand) later, even by its own author. In this paper we present DrPython–WEB, a web application capable of automatically extracting linguistic, structural, and style-related features from students' programs and of grading them with respect to a teacher-defined assessment rubric. The aim of DrPython–WEB is to accustom students to good coding practices and stylistic conventions, and thereby to improve their code. Other systems can analyze code through quality measures; the novelty of DrPython–WEB with respect to such systems is that it also analyzes linguistic and stylistic features.
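To illustrate the kind of pipeline the abstract describes, the sketch below extracts a few structural and style features from a Python program and scores them against a teacher-defined rubric. The specific features, the rubric format, and all function names here are illustrative assumptions, not DrPython–WEB's actual implementation.

```python
# A minimal sketch, assuming a feature set and rubric format of our own
# choosing: extract structural/style features from a Python source string,
# then score each against teacher-defined thresholds and weights.
import ast
import io
import tokenize


def extract_features(source: str) -> dict:
    """Extract a few simple structural and style features from source code."""
    tree = ast.parse(source)
    funcs = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    # Count comment tokens for a comment-density measure.
    comments = sum(
        1 for tok in tokenize.generate_tokens(io.StringIO(source).readline)
        if tok.type == tokenize.COMMENT
    )
    lines = max(1, len(source.splitlines()))
    return {
        "comment_density": comments / lines,
        "avg_func_length": (
            sum(n.end_lineno - n.lineno + 1 for n in funcs) / len(funcs)
            if funcs else 0.0
        ),
        "has_docstrings": all(ast.get_docstring(f) is not None for f in funcs),
    }


def grade(features: dict, rubric: dict) -> float:
    """Score features against a rubric of (predicate, weight) pairs."""
    total = sum(weight for _, weight in rubric.values())
    earned = sum(
        weight for name, (passes, weight) in rubric.items()
        if passes(features[name])
    )
    return earned / total


# A hypothetical teacher-defined rubric: each criterion is a
# pass/fail predicate with a weight.
rubric = {
    "comment_density": (lambda v: v >= 0.1, 2),  # at least 10% comment lines
    "avg_func_length": (lambda v: v <= 20, 1),   # short, focused functions
    "has_docstrings": (lambda v: v, 2),          # every function documented
}
```

A well-commented, documented submission would score 1.0 under this rubric, while a monolithic, uncommented one would lose the corresponding weighted points; the real tool additionally covers linguistic features, which this sketch omits.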




Author information

Correspondence to Andrea Sterbini or Marco Temperini.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Battistini, T., Isaia, N., Sterbini, A., Temperini, M. (2022). DrPython–WEB: A Tool to Help Teaching Well-Written Python Programs. In: Cerone, A., et al. Software Engineering and Formal Methods. SEFM 2021 Collocated Workshops. SEFM 2021. Lecture Notes in Computer Science, vol 13230. Springer, Cham. https://doi.org/10.1007/978-3-031-12429-7_20


  • DOI: https://doi.org/10.1007/978-3-031-12429-7_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-12428-0

  • Online ISBN: 978-3-031-12429-7

