Abstract
Reproducible experimentation in machine learning and data mining research requires reproducible descriptions of the algorithms involved, whether as source code, pseudo-code or prose. Efforts in academia commonly focus on the accessibility of source code. Based on an internal study reproducing unsupervised concept drift detectors, this work argues that a publication's content is equally important, and it highlights common issues affecting attempts to implement unsupervised concept drift detectors. These range from major issues that prohibit implementation entirely to minor issues that demand increased effort from the developer. The paper proposes a checklist as a consistent tool to ensure higher-quality, reproducible publications of algorithms. The issues highlighted in this work could mark a starting point, although future work is required to ensure representation of more diverse areas of research in artificial intelligence.
Acknowledgements
This paper has received partial funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 101000825 (NAUTILOS). This work also received partial funding from Niedersächsisches Vorab under grant number ZN3683 (ChESS).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Lukats, D., Stahl, F. (2023). On Reproducible Implementations in Unsupervised Concept Drift Detection Algorithms Research. In: Bramer, M., Stahl, F. (eds) Artificial Intelligence XL. SGAI 2023. Lecture Notes in Computer Science(), vol 14381. Springer, Cham. https://doi.org/10.1007/978-3-031-47994-6_16
DOI: https://doi.org/10.1007/978-3-031-47994-6_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-47993-9
Online ISBN: 978-3-031-47994-6
eBook Packages: Computer Science (R0)