Source-Free Unsupervised Domain Adaptation with Sample Transport Learning

  • Regular Paper
  • Published in: Journal of Computer Science and Technology

Abstract

Unsupervised domain adaptation (UDA) has achieved great success in cross-domain machine learning applications. It typically improves model training on an unlabeled target domain by leveraging knowledge from a labeled source domain. To this end, existing work widely adopts minimization of the marginal and conditional distribution divergences between the source and target domains. However, for the sake of privacy preservation, the source domain often provides only a trained predictor (e.g., a classifier) rather than its training data. This renders the above approaches infeasible, because the marginal and conditional distributions of the source domain cannot be computed. To address this problem, this article proposes a source-free UDA method that jointly models domain adaptation and sample transport learning, named Sample Transport Domain Adaptation (STDA). Specifically, STDA constructs a pseudo source domain according to the aggregated decision boundaries that multiple source classifiers produce on the target domain. It then refines the pseudo source domain by transporting high-confidence target samples into it, and in turn generates labels for the target domain. The STDA model is trained by alternating between domain adaptation and sample transport, eventually adapting knowledge to the target domain and producing confident labels for it. Evaluation results validate the effectiveness and superiority of the proposed method.
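The alternating procedure outlined in the abstract — aggregate the decisions of multiple source classifiers on the target domain, keep high-confidence samples as a pseudo source domain, then alternate between re-fitting a model on that set and transporting newly confident samples into it — can be sketched roughly as follows. This is a minimal toy illustration, not the authors' implementation: `CentroidClassifier`, `build_pseudo_source`, `stda_sketch`, and the confidence threshold are all hypothetical stand-ins, and the actual method's adaptation step is considerably more involved.

```python
import numpy as np

class CentroidClassifier:
    """Toy stand-in for a pre-trained source classifier: nearest class centroid."""
    def __init__(self, centroids):
        self.centroids = np.asarray(centroids)  # shape (n_classes, n_features)

    def predict_proba(self, X):
        # Softmax over negative Euclidean distances to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids[None, :, :], axis=2)
        e = np.exp(-d)
        return e / e.sum(axis=1, keepdims=True)

def build_pseudo_source(classifiers, X_target, threshold=0.6):
    """Aggregate the source classifiers' decisions on the target domain and
    keep the samples whose averaged confidence exceeds the threshold."""
    probs = np.mean([c.predict_proba(X_target) for c in classifiers], axis=0)
    mask = probs.max(axis=1) >= threshold
    return mask, probs.argmax(axis=1)

def stda_sketch(classifiers, X_target, rounds=3, threshold=0.6):
    """Alternately transport confident target samples into the pseudo source
    domain and re-fit a centroid model on it to relabel the rest.
    Toy assumption: every class receives at least one confident sample."""
    mask, labels = build_pseudo_source(classifiers, X_target, threshold)
    for _ in range(rounds):
        # Re-estimate class centroids from the current pseudo source domain.
        centroids = np.stack([X_target[mask & (labels == k)].mean(axis=0)
                              for k in np.unique(labels[mask])])
        probs = CentroidClassifier(centroids).predict_proba(X_target)
        labels = probs.argmax(axis=1)
        # Transport newly confident samples into the pseudo source domain.
        mask = mask | (probs.max(axis=1) >= threshold)
    return labels
```

On two well-separated clusters, labeling with slightly misplaced source centroids and then running a few transport rounds recovers the cluster assignments; the point of the sketch is the alternation, not the particular classifier.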



Author information

Correspondence to Qing Tian.

Supplementary Information

ESM 1

(PDF 278 kb)


About this article


Cite this article

Tian, Q., Ma, C., Zhang, FY. et al. Source-Free Unsupervised Domain Adaptation with Sample Transport Learning. J. Comput. Sci. Technol. 36, 606–616 (2021). https://doi.org/10.1007/s11390-021-1106-5

