
Graph-based fine-grained model selection for multi-source domain

  • Theoretical Advances
  • Published in Pattern Analysis and Applications

Abstract

The proliferation of datasets and model architectures has led to an abundance of pretrained source models, which simplifies learning in multi-domain transfer settings. However, data complexity, domain shifts, and performance limitations make it difficult to decide which source model to transfer. Source model selection has therefore emerged as a promising way to choose the best model for a given target domain. Most existing work combines transferability estimation with statistical methods to derive a model selection probability, a coarse-grained approach that selects a single model and offers limited accuracy and applicability in multi-source domains. To overcome this limitation, we propose a graph-based fine-grained multi-source model selection method (GFMS) that adaptively selects the best source model for each individual target-domain sample. The proposed method comprises three main components: building a source model library through cross-training; generating the selection strategy with graph neural networks that explore the similarities among data features and the associations between features and models; and blending the selected models with a weighted scheme to obtain the best model adaptively. Experimental results demonstrate that the proposed adaptive method achieves higher accuracy in both model selection and image classification than current state-of-the-art methods on the compared datasets.
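
The sketch below illustrates the fine-grained selection idea described in the abstract: a graph built over target-sample features, a small graph neural network that produces per-sample weights over the source models in the library, and a weighted blending of the models' predictions. This is a minimal illustration under stated assumptions, not the authors' implementation; the names (knn_adjacency, SelectionGNN, blend_predictions), the k-NN graph construction, and the toy tensors are all hypothetical, and the paper's cross-training and training objective are not reproduced here.

```python
# Minimal sketch of GNN-based fine-grained model selection (illustrative only;
# all names and the graph construction are assumptions, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

def knn_adjacency(features: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Build a row-normalized k-NN similarity graph over target-sample features."""
    normed = F.normalize(features, dim=1)
    sim = normed @ normed.t()
    topk = sim.topk(k, dim=1).indices
    adj = torch.zeros_like(sim)
    adj.scatter_(1, topk, 1.0)
    adj = adj + torch.eye(adj.size(0))          # reinforce self-connections
    return adj / adj.sum(dim=1, keepdim=True)   # row-normalize for propagation

class SelectionGNN(nn.Module):
    """Two graph-convolution-style layers that map sample features to
    per-sample weights over the source models in the library."""
    def __init__(self, feat_dim: int, hidden_dim: int, num_models: int):
        super().__init__()
        self.fc1 = nn.Linear(feat_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, num_models)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = F.relu(adj @ self.fc1(x))            # propagate over the feature graph
        logits = adj @ self.fc2(h)               # per-sample model-selection logits
        return F.softmax(logits, dim=1)          # one weight row per target sample

def blend_predictions(weights: torch.Tensor, model_probs: torch.Tensor) -> torch.Tensor:
    """Weighted blending of source-model outputs.
    weights: (num_samples, num_models); model_probs: (num_models, num_samples, num_classes)."""
    return torch.einsum('nm,mnc->nc', weights, model_probs)

if __name__ == "__main__":
    # Toy usage: random tensors stand in for extracted features and for the
    # class probabilities produced by each pretrained source model.
    n_samples, feat_dim, n_models, n_classes = 32, 128, 4, 10
    feats = torch.randn(n_samples, feat_dim)
    probs = torch.rand(n_models, n_samples, n_classes).softmax(dim=-1)
    gnn = SelectionGNN(feat_dim, 64, n_models)
    w = gnn(feats, knn_adjacency(feats))
    blended = blend_predictions(w, probs)        # (n_samples, n_classes)
    print(blended.shape)
```

In this reading, "fine-grained" means the selection weights vary per target sample rather than picking one source model for the whole target domain; the weighted blend reduces to single-model selection when a weight row is close to one-hot.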


Data availability

Data will be made available on reasonable request.


Funding

This work was funded by the National Natural Science Foundation of China (No. 62172442).

Author information

Corresponding authors

Correspondence to Yuhang Huang or Meiguang Zheng.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Hu, Z., Huang, Y., Zheng, H. et al. Graph-based fine-grained model selection for multi-source domain. Pattern Anal Applic 26, 1481–1492 (2023). https://doi.org/10.1007/s10044-023-01176-6

