DOI: 10.1145/3663529.3663810

ModelFoundry: A Tool for DNN Modularization and On-Demand Model Reuse Inspired by the Wisdom of Software Engineering

Published: 10 July 2024

Abstract

Reusing DNN models provides an efficient way to meet new requirements without training models from scratch. Recently, inspired by the wisdom of software reuse, on-demand model reuse has drawn much attention; it aims to reduce the overhead and security risks of model reuse by decomposing models into modules and reusing only the modules that match users' requirements. However, existing efforts toward on-demand model reuse mainly provide algorithm implementations without tool support. These implementations involve ad-hoc decomposition in experiments and require considerable manual effort to adapt to new models, which obstructs the practicality of on-demand model reuse. In this paper, we introduce ModelFoundry, a tool that systematically integrates two modularization approaches proposed in our prior work. ModelFoundry supports automated model decomposition and module reuse, making on-demand model reuse more practical and easier to integrate into model-sharing platforms. Evaluations conducted on widely used models sourced from the PyTorch and GitHub platforms demonstrate that ModelFoundry achieves effective model decomposition and module reuse, as well as good generalizability across various models. A demonstration is available at https://youtu.be/dXHeQ0fGldk.
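
The following is a minimal, hypothetical sketch of the idea behind on-demand model reuse, written in plain PyTorch: a classifier trained on many classes is reduced to a smaller module that serves only the classes a user needs. The function name extract_module and the class-selection strategy are illustrative assumptions, not ModelFoundry's actual API or decomposition algorithm.

```python
# Hypothetical sketch of on-demand model reuse: keep the trained backbone
# but retain only the output units for the requested classes. This is NOT
# ModelFoundry's API; it only illustrates the concept with plain PyTorch.
import torch
import torch.nn as nn
from torchvision.models import resnet18


def extract_module(model: nn.Module, wanted_classes: list) -> nn.Module:
    """Derive a smaller 'module' from a trained classifier that predicts
    only the requested classes (a crude stand-in for modularization)."""
    head = model.fc                      # final linear layer of ResNet-18
    new_head = nn.Linear(head.in_features, len(wanted_classes))
    with torch.no_grad():
        new_head.weight.copy_(head.weight[wanted_classes])
        new_head.bias.copy_(head.bias[wanted_classes])
    model.fc = new_head                  # model now scores only the wanted classes
    return model


# Example: reuse an ImageNet-pretrained ResNet-18 for a 3-class task.
module = extract_module(resnet18(weights="IMAGENET1K_V1"), wanted_classes=[1, 7, 42])
logits = module(torch.randn(1, 3, 224, 224))   # shape: (1, 3)
```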


    Published In

    FSE 2024: Companion Proceedings of the 32nd ACM International Conference on the Foundations of Software Engineering
    July 2024
    715 pages
    ISBN: 9798400706585
    DOI: 10.1145/3663529
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 10 July 2024


    Author Tags

    1. DNN modularization
    2. model reuse
    3. module composition

    Qualifiers

    • Research-article

    Conference

    FSE '24

    Acceptance Rates

    Overall Acceptance Rate 112 of 543 submissions, 21%

