
Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation

  • Conference paper
Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11303)


Abstract

Multi-label dimensionality reduction is an appealing and challenging task in data mining and machine learning. Previous work on multi-label dimensionality reduction has mainly been conducted in an unsupervised or supervised manner, ignoring the abundant unlabeled samples. In addition, most existing methods emphasize pairwise correlations between samples and are therefore unable to exploit high-order sample information to improve performance. To address these challenges, we propose an approach called Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation (SMLD-LRR). SMLD-LRR first applies low rank representation in the feature space of samples to compute a low rank constrained coefficient matrix, and then uses this coefficient matrix to capture the high-order structure of samples. Next, it applies low rank representation in the label space of labeled samples to explore the global correlations among labels. SMLD-LRR then employs the learned high-order sample structure to enforce consistency between samples in the original space and their counterparts in the projected subspace by maximizing the dependence between them. Finally, these two high-order correlations and the dependence term are incorporated into multi-label linear discriminant analysis for dimensionality reduction. Extensive experimental results on four multi-label datasets demonstrate that SMLD-LRR outperforms other competitive methods across various evaluation criteria; it can also effectively exploit high-order label correlations to preserve sample structure in the projected subspace.
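Two building blocks of the approach described above can be sketched in a few lines: the low rank representation coefficient matrix, and a dependence measure of the kind commonly used for dependence maximization (the empirical Hilbert-Schmidt Independence Criterion, HSIC). This is a minimal illustration under simplifying assumptions, not the paper's implementation: it uses the known closed-form solution of the noiseless LRR problem min ||Z||_* s.t. X = XZ (the shape-interaction matrix), and both function names are ours.

```python
import numpy as np

def lrr_noiseless(X):
    """Closed-form solution of min ||Z||_*  s.t.  X = X Z.

    For noiseless data the minimizer is the shape-interaction matrix
    Z* = V V^T, where X = U S V^T is the skinny SVD of X.
    Columns of X are samples, so Z is an n x n coefficient matrix.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int(np.sum(s > 1e-10 * s[0]))  # numerical rank of X
    V = Vt[:r].T                       # right singular vectors, n x r
    return V @ V.T

def hsic(K, L):
    """Empirical Hilbert-Schmidt Independence Criterion.

    K and L are n x n kernel (Gram) matrices over the same n samples;
    larger values indicate stronger statistical dependence between the
    two representations.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check: a rank-2 data matrix (d=10 features, n=30 samples) is
# exactly self-represented by its LRR coefficient matrix, X Z = X.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 30))
Z = lrr_noiseless(X)
print(np.allclose(X @ Z, X))
```

In the semi-supervised setting sketched by the abstract, a matrix like `Z` (computed over both labeled and unlabeled samples) supplies the high-order sample structure, while an HSIC-style term ties the original samples to their projections in the learned subspace.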

Notes

  1. Available at http://mulan.sourceforge.net/datasets-mlc.html.


Author information

Correspondence to Yezi Liu.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, Y. (2018). Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11303. Springer, Cham. https://doi.org/10.1007/978-3-030-04182-3_55


  • DOI: https://doi.org/10.1007/978-3-030-04182-3_55

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04181-6

  • Online ISBN: 978-3-030-04182-3

  • eBook Packages: Computer Science, Computer Science (R0)
