ABSTRACT
Open-set domain adaptation aims to improve the generalization performance of a learning algorithm on a target task of interest by leveraging label information from a relevant source task that covers only a subset of the target classes. However, most existing works are designed for the static setting and can hardly be extended to the dynamic setting common in many real-world applications. In this paper, we focus on a more realistic open-set domain adaptation setting with a static source task and a time-evolving target task in which novel unknown target classes appear over time. Specifically, we show that the classification error of the new target task can be tightly bounded in terms of positive-unlabeled classification errors over historical tasks and the open-set domain discrepancy across tasks. By empirically minimizing this upper bound on the target error, we propose a novel positive-unlabeled learning based algorithm, named OuterAdapter, for dynamic open-set domain adaptation with time-evolving unknown classes. Extensive experiments on various data sets demonstrate the effectiveness and efficiency of the proposed OuterAdapter algorithm over state-of-the-art domain adaptation baselines.
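The bound above involves positive-unlabeled (PU) classification errors. As background, a minimal NumPy sketch of the non-negative PU risk estimator of Kiryo et al. (2017) — a standard estimator in this line of work, shown here for illustration and not necessarily OuterAdapter's exact objective — assuming the positive class prior `prior` is known:

```python
import numpy as np

def sigmoid_loss(z, y):
    # Sigmoid surrogate loss l(z, y) = 1 / (1 + exp(y * z)),
    # a common smooth loss choice in PU learning.
    return 1.0 / (1.0 + np.exp(np.clip(y * z, -30.0, 30.0)))

def nn_pu_risk(scores_p, scores_u, prior):
    """Non-negative PU risk estimate:
        prior * R_p^+ + max(0, R_u^- - prior * R_p^-)
    where scores_p are classifier outputs on labeled positives and
    scores_u are outputs on unlabeled samples."""
    risk_p_pos = sigmoid_loss(scores_p, +1).mean()  # positives treated as positive
    risk_p_neg = sigmoid_loss(scores_p, -1).mean()  # positives treated as negative
    risk_u_neg = sigmoid_loss(scores_u, -1).mean()  # unlabeled treated as negative
    # The max(0, .) correction keeps the negative-class risk term non-negative,
    # which mitigates overfitting of flexible models to the unlabeled data.
    return prior * risk_p_pos + max(0.0, risk_u_neg - prior * risk_p_neg)
```

In the dynamic open-set setting, samples from known source classes play the role of the positive data, while the evolving target data (mixing known and novel unknown classes) is unlabeled, so an estimator of this form can be evaluated per historical task.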
Index Terms
- Domain Adaptation with Dynamic Open-Set Targets