
Joint Robust Multi-view Spectral Clustering

Published in: Neural Processing Letters

Abstract

Existing multi-view clustering algorithms typically rely on multistage strategies, require the number of clusters or a similarity matrix as prior knowledge, or are susceptible to irrelevant features and outliers. In this paper, we propose a Joint Robust Multi-view (JRM) spectral clustering algorithm that exploits information from all views of a multi-view dataset and addresses initialization, cluster-number determination, similarity measurement, feature selection, and outlier reduction in a single unified framework. Considering all views and combining these otherwise separate stages allows the algorithm to approach optimal performance. Experiments on six real-world benchmark datasets show that JRM outperforms the comparison clustering algorithms on two standard evaluation metrics: accuracy and purity.
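The abstract describes folding affinity construction and spectral clustering over all views into one procedure. As a point of reference only (not the JRM algorithm, whose joint objective is given in the full text), a minimal multi-view spectral clustering baseline can be sketched: per-view Gaussian affinities averaged with equal weights, followed by standard normalized spectral clustering. The RBF kernel, the equal view weighting, and the toy data below are illustrative assumptions.

```python
# A generic multi-view spectral clustering baseline (illustrative sketch,
# NOT the JRM algorithm): average per-view RBF affinities, then cluster
# the spectral embedding of the normalized graph Laplacian.
import numpy as np

def rbf_affinity(X, gamma=1.0):
    # Gaussian kernel on pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kmeans(X, k, iters=50):
    # Deterministic farthest-point initialization, then Lloyd iterations.
    centers = [X[0].copy()]
    for _ in range(k - 1):
        d = np.min(np.stack([((X - c) ** 2).sum(axis=1) for c in centers]), axis=0)
        centers.append(X[np.argmax(d)].copy())
    centers = np.array(centers)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            mask = labels == j
            if mask.any():
                centers[j] = X[mask].mean(axis=0)
    return labels

def multiview_spectral(views, k, gamma=1.0):
    # Equal-weight average of per-view affinities (an assumption; JRM
    # instead learns the combination jointly).
    W = np.mean([rbf_affinity(V, gamma) for V in views], axis=0)
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L_sym = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)      # eigenvalues in ascending order
    U = vecs[:, :k]                         # k smallest eigenvectors
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    return kmeans(U, k)

# Toy example: two well-separated clusters observed through two "views".
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.1, (10, 2))
b = rng.normal(5.0, 0.1, (10, 2))
view1 = np.vstack([a, b])
view2 = view1 @ rng.normal(size=(2, 3))     # a linear second view
labels = multiview_spectral([view1, view2], k=2)
print(labels)
```

On this toy data the first ten points and last ten points land in different clusters; the sketch is only meant to make the "one affinity from many views, then spectral clustering" pipeline concrete.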


Figs. 1–3: available in the full article.


Author information

Corresponding author

Correspondence to Tong Liu.


About this article


Cite this article

Liu, T., Martin, G., Zhu, Y. et al. Joint Robust Multi-view Spectral Clustering. Neural Process Lett 52, 1843–1862 (2020). https://doi.org/10.1007/s11063-020-10257-0
