Ordinal unsupervised multi-target domain adaptation with implicit and explicit knowledge exploitation

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

As an emerging research topic in machine learning, unsupervised domain adaptation (UDA) aims to transfer prior knowledge from a source domain to help train a model on an unlabeled target domain. Although a variety of UDA methods have been proposed, they mainly concentrate on one-source-to-one-target (1S1T) or multi-source-to-one-target (mS1T) scenarios; work on one-source-to-multi-target (1SmT) UDA is rare and is mainly designed for ordinary (non-ordinal) problems. When confronted with ordinal 1SmT tasks, where an order relationship exists among the data labels, existing methods degrade in performance because the label relationships are not preserved. In this article, we propose an ordinal 1SmT UDA model that transfers both explicit and implicit knowledge from the supervised source domain and the unsupervised target domains, respectively, via distribution alignment and dictionary transmission. We also design an efficient algorithm to solve the model and analyze its convergence and complexity. Finally, the effectiveness of the proposed method is evaluated with extensive experiments.
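
For intuition about the distribution-alignment component mentioned above, the sketch below computes a linear-kernel Maximum Mean Discrepancy (MMD) between source and target features, a standard measure of the marginal distribution gap that alignment terms of this kind typically minimize. This is a minimal illustration under that assumption, not the paper's actual objective; the function name `linear_mmd` and the projection `W` are hypothetical.

```python
import numpy as np

def linear_mmd(Xs, Xt):
    """Squared MMD with a linear kernel.

    Xs: (n_s, d) source features, Xt: (n_t, d) target features.
    Returns ||mean(Xs) - mean(Xt)||^2, the gap between the domain means
    that distribution-alignment terms try to shrink.
    """
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

# Toy usage: a shared projection W that lowers the MMD between projected
# source and target features brings their distributions closer together.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(100, 16))   # labeled source domain
Xt = rng.normal(0.5, 1.0, size=(80, 16))    # unlabeled target domain
W = rng.normal(size=(16, 8))                # hypothetical projection
print(linear_mmd(Xs, Xt))                   # gap in the original space
print(linear_mmd(Xs @ W, Xt @ W))           # gap in the projected subspace
```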

Notes

  1. Since the \(l_{2,1}\)-norm on \(\varvec{V}_T^m\) is convex [46], (18) is entirely convex.
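
For reference, the \(l_{2,1}\)-norm of a matrix is the sum of the Euclidean norms of its rows [46], which makes it a sum of convex functions and hence convex. The snippet below is a minimal numpy sketch of this definition; the function name `l21_norm` is hypothetical and not part of the paper's code.

```python
import numpy as np

def l21_norm(V):
    """l_{2,1}-norm: sum of the Euclidean (l2) norms of the rows of V.

    Each row norm is convex in V and a sum of convex functions is convex,
    which is why adding this term to an objective keeps it convex.
    """
    return float(np.linalg.norm(V, axis=1).sum())

V = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 2.0]])
print(l21_norm(V))  # 5.0 + 0.0 + sqrt(5) ≈ 7.236
```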

References

  1. Zhao M, Zhan C, Wu Z, Tang P (2015) Semi-supervised image classification based on local and global regression. IEEE Signal Process Lett 22(10):1666–1670

  2. Zhao MB, Chow TWS, Peng T, Wang Z, Zukerman M (2016) Route selection for cabling considering cost minimization and earthquake survivability via a semi-supervised probabilistic model. IEEE Trans Industr Inf 13(2):1–1

  3. Gong B, Shi Y, Sha F, Grauman K (2012) Geodesic flow kernel for unsupervised domain adaptation. In: 2012 IEEE conference on computer vision and pattern recognition, pp. 2066–2073

  4. Zhuang F, Luo P, Du C, He Q, Shi Z, Xiong H (2013) Triplex transfer learning: exploiting both shared and distinct concepts for text classification. IEEE Trans Cybern 44(7):1191–1203

  5. Long M, Zhu H, Wang J, Jordan MI (2016) Unsupervised domain adaptation with residual transfer networks. In: Proceedings of the 30th international conference on neural information processing systems, pp. 136–144

  6. Tahmoresnezhad J, Hashemi S (2016) Visual domain adaptation via transfer feature learning. Knowl Inf Syst 50(2):1–21

  7. Zhang L, Zhang D (2016) Robust visual knowledge transfer via extreme learning machine-based domain adaptation. IEEE Trans Image Process 25(10):4959–4973

  8. Liu J, Zhang L (2019) Optimal projection guided transfer hashing for image retrieval. In: Proceedings of the AAAI conference on artificial intelligence, vol. 33, pp. 8754–8761

  9. Liang J, Hu D, Feng J (2020) Do we really need to access the source data? Source hypothesis transfer for unsupervised domain adaptation. In: International conference on machine learning, pp. 6028–6039

  10. Tian Q, Sun H, Ma C, Cao M, Chu Y, Chen S (2021) Heterogeneous domain adaptation with structure and classification space alignment. IEEE Trans Cybern. https://doi.org/10.1109/TCYB.2021.3070545

  11. Pan SJ, Yang Q (2009) A survey on transfer learning. IEEE Trans Knowl Data Eng 22(10):1345–1359

  12. Cortes C, Mohri M, Riley M, Rostamizadeh A (2008) Sample selection bias correction theory. In: International conference on algorithmic learning theory, pp. 38–53

  13. Yao Y, Doretto G (2010) Boosting for transfer learning with multiple sources. In: 2010 IEEE computer society conference on computer vision and pattern recognition, pp. 1855–1862

  14. Tan B, Song Y, Zhong E, Yang Q (2015) Transitive transfer learning. In: Proceedings of the 21th ACM SIGKDD international conference on knowledge discovery and data mining, pp. 1155–1164

  15. Khan MNA, Heisterkamp DR (2016) Adapting instance weights for unsupervised domain adaptation using quadratic mutual information and subspace learning. In: 2016 23rd international conference on pattern recognition (ICPR), pp. 1560–1565

  16. Tan B, Zhang Y, Pan SJ, Yang Q (2017) Distant domain transfer learning. In: Thirty-first AAAI conference on artificial intelligence, pp. 2604–2610

  17. Long M, Wang J, Sun J, Philip SY (2014) Domain invariant transfer kernel learning. IEEE Trans Knowl Data Eng 27(6):1519–1532

  18. Tian Q, Chen S (2017) Cross-heterogeneous-database age estimation through correlation representation learning. Neurocomputing 238:286–295

  19. Li J, Lu K, Huang Z, Zhu L, Shen H (2019) Heterogeneous domain adaptation through progressive alignment. IEEE Trans Neural Netw Learn Syst 30(5):1381

  20. Zhang L, Wang S, Huang G-B, Zuo W, Yang J, Zhang D (2019) Manifold criterion guided transfer learning via intermediate domain generation. IEEE Trans Neural Netw Learn Syst 30(12):3759–3773

  21. Tian L, Tang Y, Hu L, Ren Z, Zhang W (2020) Domain adaptation by class centroid matching and local manifold self-learning. IEEE Trans Image Process 29:9703–9718

  22. Wang W, Chen S, Xiang Y, Sun J, Li H, Wang Z, Sun F, Ding Z, Li B (2021) Sparsely-labeled source assisted domain adaptation. Pattern Recogn 112:107803

  23. Zhao Z, Chen Y, Liu J, Liu M (2010) Cross-mobile ELM-based activity recognition. Int J Eng Ind 1(1):30–38

  24. Zhao Z, Chen Y, Liu J, Shen Z, Liu M (2011) Cross-people mobile-phone based activity recognition. In: Twenty-second international joint conference on artificial intelligence, pp. 2545–2550

  25. Sun S, Xu Z, Yang M (2013) Transfer learning with part-based ensembles. In: International workshop on multiple classifier systems, pp. 271–282

  26. Wei Y, Zhu Y, Leung CW-k, Song Y, Yang Q (2016) Instilling social to physical: co-regularized heterogeneous transfer learning. In: Thirtieth AAAI conference on artificial intelligence, pp. 1338–1344

  27. Yu H, Chen S (2019) Whole unsupervised domain adaptation using sparse representation of parameter dictionary. J Front Comput Sci Technol 13(05):822–833

  28. Sugiyama M, Nakajima S, Kashima H, Buenau P, Kawanabe M (2007) Direct importance estimation with model selection and its application to covariate shift adaptation. In: NIPS'07: Proceedings of the 20th international conference on neural information processing systems, pp 1433–1440

  29. Long M, Cao Y, Wang J, Jordan M (2015) Learning transferable features with deep adaptation networks. In: International conference on machine learning, pp. 97–105. PMLR

  30. Pan SJ, Tsang IW, Kwok JT, Yang Q (2010) Domain adaptation via transfer component analysis. IEEE Trans Neural Netw 22(2):199–210

  31. Sun B, Feng J, Saenko K (2016) Return of frustratingly easy domain adaptation. In: Proceedings of the AAAI conference on artificial intelligence, vol. 30

  32. Zellinger W, Grubinger T, Lughofer E, Natschläger T, Saminger-Platz S (2017) Central moment discrepancy (CMD) for domain-invariant representation learning. arXiv preprint arXiv:1702.08811

  33. Mancini M, Porzi L, Bulo SR, Caputo B, Ricci E (2018) Boosting domain adaptation by discovering latent domains. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp. 3771–3780

  34. Caseiro R, Henriques JF, Martins P, Batista J (2015) Beyond the shortest path: Unsupervised domain adaptation by sampling subspaces along the spline flow. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp. 3846–3854

  35. Peng P, Xiang T, Wang Y, Pontil M, Gong S, Huang T, Tian Y (2016) Unsupervised cross-dataset transfer learning for person re-identification. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp. 1306–1315

  36. Zhao H, Zhang S, Wu G, Moura JM, Costeira JP, Gordon GJ (2018) Adversarial multiple source domain adaptation. Adv Neural Inf Process Syst 31

  37. Dai Y, Liu J, Ren X, Xu Z (2020) Adversarial training based multi-source unsupervised domain adaptation for sentiment analysis. In: Proceedings of the AAAI conference on artificial intelligence, vol. 34, pp. 7618–7625

  38. Yang L, Balaji Y, Lim S-N, Shrivastava A (2020) Curriculum manager for source selection in multi-source domain adaptation. In: European conference on computer vision. Springer, New York, pp. 608–624

  39. Guo H, Pasunuru R, Bansal M (2020) Multi-source domain adaptation for text classification via distancenet-bandits. In: Proceedings of the AAAI conference on artificial intelligence, vol. 34, pp. 7830–7838

  40. Wang H, Xu M, Ni B, Zhang W (2020) Learning to combine: knowledge aggregation for multi-source domain adaptation. In: European conference on computer vision. Springer, New York, pp. 727–744

  41. Sun B-Y, Li J, Wu DD, Zhang X-M, Li W-B (2009) Kernel discriminant learning for ordinal regression. IEEE Trans Knowl Data Eng 22(6):906–910

  42. Sun B-Y, Wang H-L, Li W-B, Wang H-J, Li J, Du Z-Q (2015) Constructing and combining orthogonal projection vectors for ordinal regression. Neural Process Lett 41(1):139–155

  43. Pan SJ, Tsang IW, Kwok JT, Yang Q (2010) Domain adaptation via transfer component analysis. IEEE Trans Neural Netw 22(2):199–210

  44. Wang J, Feng W, Chen Y, Yu H, Huang M, Yu PS (2018) Visual domain adaptation with manifold embedded distribution alignment. In: Proceedings of the 26th ACM international conference on multimedia, pp. 402–410

  45. Ben-David S, Blitzer J, Crammer K, Pereira F (2007) Analysis of representations for domain adaptation. In: Advances in neural information processing systems, pp. 137–144

  46. Nie F, Huang H, Cai X, Ding CH (2010) Efficient and robust feature selection via joint l2,1-norms minimization. In: Advances in neural information processing systems, pp. 1813–1821

  47. De Campos TE, Babu BR, Varma M et al (2009) Character recognition in natural images. VISAPP 2(7):273–280

  48. Moschoglou S, Papaioannou A, Sagonas C, Deng J, Kotsia I, Zafeiriou S (2017) Agedb: the first manually collected, in-the-wild age database. In: Proceedings of the IEEE conference on computer vision and pattern recognition workshops, pp. 51–59

  49. Ricanek K, Tesafaye T (2006) Morph: A longitudinal image database of normal adult age-progression. In: 7th international conference on automatic face and gesture recognition (FGR06), pp. 341–345

  50. Chen B-C, Chen C-S, Hsu WH (2014) Cross-age reference coding for age-invariant face recognition and retrieval. In: European conference on computer vision, pp. 768–783

  51. Dai W, Yang Q, Xue G-R, Yu Y (2008) Self-taught clustering. In: Proceedings of the 25th international conference on machine learning, pp. 200–207

  52. Jiang W, Chung F-l (2012) Transfer spectral clustering. In: Joint European conference on machine learning and knowledge discovery in databases, pp. 789–803

  53. Deng Z, Jiang Y, Chung F-L, Ishibuchi H, Choi K-S, Wang S (2015) Transfer prototype-based fuzzy clustering. IEEE Trans Fuzzy Syst 24(5):1210–1232

  54. Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30

  55. Zhao M, Zhang Y, Zhang Z, Liu J, Kong W (2019) Alg: adaptive low-rank graph regularization for scalable semi-supervised and unsupervised learning. Neurocomputing 370:16–27

Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant 62176128, the Open Projects Program of the State Key Laboratory for Novel Software Technology of Nanjing University under Grant KFKT2022B06, the Fundamental Research Funds for the Central Universities under No. NJ2022028, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD) fund, and the Qing Lan Project.

Author information

Corresponding author

Correspondence to Qing Tian.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Tian, Q., Sun, H., Chu, Y. et al. Ordinal unsupervised multi-target domain adaptation with implicit and explicit knowledge exploitation. Int. J. Mach. Learn. & Cyber. 13, 3807–3820 (2022). https://doi.org/10.1007/s13042-022-01626-3
