ABSTRACT
Multi-task learning (MTL) aims to learn multiple related tasks simultaneously, rather than separately, so as to improve the generalization performance of each task. Most existing MTL methods assume that all tasks share the same feature representation; however, this assumption often fails to hold in real-world applications. In this paper, we study MTL in the setting where each task has its own heterogeneous feature space. To address this problem, we first construct an integrated graph from a set of bipartite graphs to build connections among the different tasks. We then propose a multi-task nonnegative matrix factorization (MTNMF) method to learn a common semantic feature space underlying the heterogeneous feature spaces of the tasks. Finally, combining the common semantic features with the original heterogeneous features, we cast the heterogeneous MTL problem as a multi-task multi-view learning (MTMVL) problem, so that a number of existing MTMVL methods can be applied to solve it effectively. Extensive experiments on three real-world problems demonstrate the effectiveness of the proposed method.
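To make the core idea concrete, the following is a minimal sketch of the kind of joint factorization the abstract describes, not the paper's actual MTNMF algorithm. It uses plain multiplicative-update NMF (Lee and Seung) on two hypothetical task matrices stacked column-wise, so that both tasks are forced to share one row factor `W`, which plays the role of the common semantic feature space; all data and dimensions here are illustrative.

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Basic NMF via multiplicative updates: X ~= W @ H with W, H >= 0."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(n_iter):
        # Standard Lee-Seung updates; eps guards against division by zero.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Two toy "tasks" over heterogeneous feature spaces (4 vs. 5 features)
# observed on the same 6 instances (hypothetical nonnegative data).
X1 = np.random.default_rng(1).random((6, 4))
X2 = np.random.default_rng(2).random((6, 5))

# Stacking side by side ties both factorizations to a single shared W:
# X1 ~= W @ H1 and X2 ~= W @ H2, with H = [H1 | H2].
X = np.hstack([X1, X2])
W, H = nmf(X, k=3)

# W (6 x 3) gives each instance a common 3-dimensional semantic
# representation usable alongside the original per-task features.
err = np.linalg.norm(X - W @ H)
```

In this sketch the shared `W` is what would then be combined with each task's original features to form the two "views" of a multi-task multi-view learner; the paper's MTNMF additionally couples the tasks through the integrated bipartite graph, which this toy example omits.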