
Multi-label active learning by model guided distribution matching

  • Research Article
  • Published in Frontiers of Computer Science

Abstract

Multi-label learning is an effective framework for learning with objects that have multiple semantic labels, and has been successfully applied to many real-world tasks. In contrast to traditional single-label learning, the cost of labeling a multi-label example is much higher, so it is important to train an effective multi-label learning model with as few labeled examples as possible. Active learning, which actively selects the most valuable data and queries their labels, is a primary approach to reducing labeling cost. In this paper, we propose MADM, a novel approach for batch-mode multi-label active learning. On one hand, MADM exploits representativeness and diversity in both the feature and label space by matching the distribution between labeled and unlabeled data. On the other hand, it tends to query predicted positive instances, which are expected to be more informative than negative ones. Experiments on benchmark datasets demonstrate that the proposed approach can reduce the labeling cost significantly.
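The abstract's two ingredients, matching the distribution of the queried batch to the overall data distribution and preferring predicted positive instances, can be illustrated with a small sketch. The snippet below is not the authors' MADM algorithm; it is a minimal greedy illustration assuming an RBF-kernel maximum mean discrepancy (MMD) as the distribution-matching criterion and a caller-supplied vector of predicted-positive scores. The names `select_batch`, `pos_scores`, `beta`, and `batch_size` are hypothetical.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def select_batch(X_labeled, X_unlabeled, pos_scores, batch_size=5,
                 gamma=1.0, beta=0.1):
    """Greedily pick unlabeled points so that the labeled set plus the batch
    matches the full data distribution (small empirical MMD), with a bonus
    for points whose predicted-positive score is high."""
    X_all = np.vstack([X_labeled, X_unlabeled])
    n_l = len(X_labeled)
    K = rbf_kernel(X_all, gamma=gamma)          # kernel over all points
    labeled_idx = list(range(n_l))              # indices into X_all
    pool = list(range(n_l, len(X_all)))         # candidate (unlabeled) indices
    selected = []

    def mmd2(idx):
        # squared MMD between the subset `idx` and the whole dataset
        idx = np.asarray(idx)
        return (K[np.ix_(idx, idx)].mean()      # within-subset similarity
                - 2 * K[idx, :].mean()          # subset vs. all points
                + K.mean())                     # within-all similarity

    for _ in range(batch_size):
        best_i, best_obj = None, np.inf
        for i in pool:
            # distribution-matching term minus an informativeness bonus
            obj = mmd2(labeled_idx + selected + [i]) - beta * pos_scores[i - n_l]
            if obj < best_obj:
                best_i, best_obj = i, obj
        selected.append(best_i)
        pool.remove(best_i)
    return [i - n_l for i in selected]          # indices into X_unlabeled

# Toy usage: 20 labeled and 200 unlabeled 2-D points, random "positive" scores.
rng = np.random.default_rng(0)
X_l, X_u = rng.normal(size=(20, 2)), rng.normal(size=(200, 2))
print(select_batch(X_l, X_u, pos_scores=rng.random(200), batch_size=5))
```

In this toy objective, a smaller `beta` emphasizes covering the data distribution, while a larger `beta` emphasizes querying points predicted positive; the trade-off between these two terms is the design choice the abstract alludes to.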

Author information

Corresponding author

Correspondence to Sheng-Jun Huang.

Additional information

Nengneng Gao received his BS from Henan University, China in 2014. He received the National Endeavor Fellowship and was named an Outstanding Student of Henan University in 2011, 2012, and 2013. He is currently a master's student at the College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, China. His research interests include machine learning and pattern recognition.

Sheng-Jun Huang received his BS and PhD degrees in computer science from Nanjing University in 2008 and 2014, respectively. He received the Microsoft Fellowship Award in 2011 and the Best Poster Award at the 18th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD) in 2012. He joined the PAttern Recognition and NEural Computing (PARNEC) laboratory as an assistant professor in 2014. His research interests include machine learning and pattern recognition.

Songcan Chen received his BS degree from Hangzhou University (now merged into Zhejiang University), China, his MS degree from Shanghai Jiao Tong University, China, and his PhD degree from Nanjing University of Aeronautics and Astronautics (NUAA), China in 1983, 1985, and 1997, respectively. He joined NUAA in 1986 and has been a full-time professor with the Department of Computer Science and Engineering since 1998. He has authored or coauthored over 170 peer-reviewed scientific papers and received Honorable Mentions for the 2006, 2007, and 2010 Best Paper Awards of the Pattern Recognition journal. His current research interests include pattern recognition, machine learning, and neural computing.

About this article

Cite this article

Gao, N., Huang, SJ. & Chen, S. Multi-label active learning by model guided distribution matching. Front. Comput. Sci. 10, 845–855 (2016). https://doi.org/10.1007/s11704-016-5421-x
