Abstract
The explorative and iterative nature of developing and operating ML applications produces a variety of artifacts, such as datasets, features, models, hyperparameters, metrics, software, configurations, and logs. To enable comparability, reproducibility, and traceability of these artifacts across the steps and iterations of the ML lifecycle, systems and tools have been developed to support their collection, storage, and management. The precise functional scope of such systems is often not obvious, which makes comparing candidates and estimating synergy effects between them challenging. In this paper, we give an overview of systems and platforms that support the management of ML lifecycle artifacts. Based on a systematic literature review, we derive assessment criteria and apply them to a representative selection of more than 60 systems and platforms.