Abstract
Existing multi-view clustering algorithms either rely on multistage strategies to conduct clustering, require the number of clusters or a similarity matrix as prior knowledge, or are susceptible to irrelevant features and outliers. In this paper, we propose a Joint Robust Multi-view (JRM) spectral clustering algorithm that exploits information from all views of a multi-view dataset and addresses initialization, cluster-number determination, similarity measurement, feature selection, and outlier reduction in a unified framework. Optimal performance is reached when all views are considered and the otherwise separate stages are solved jointly. Experiments on six real-world benchmark datasets show that the proposed JRM algorithm outperforms the comparison clustering algorithms on two standard evaluation metrics for clustering, accuracy and purity.
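To make the setting concrete, the sketch below shows a baseline multi-view spectral clustering pipeline: per-view affinities are averaged with equal weights, a normalized-Laplacian spectral embedding is computed, and the rows are clustered with plain k-means. This is not the JRM algorithm itself (JRM learns view weights, feature relevance, and robustness to outliers jointly); the equal-weight fusion, the Gaussian kernel bandwidth, and the farthest-first k-means initialization are illustrative assumptions.

```python
import numpy as np

def view_affinity(X, sigma=1.0):
    """Gaussian (RBF) affinity matrix from pairwise squared distances."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def multiview_spectral_clustering(views, k, n_iter=50):
    # Fuse views by averaging affinities (equal weights; JRM learns these jointly).
    W = sum(view_affinity(X) for X in views) / len(views)
    np.fill_diagonal(W, 0.0)

    # Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L_sym = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

    # Spectral embedding: eigenvectors of the k smallest eigenvalues, row-normalized.
    _, vecs = np.linalg.eigh(L_sym)
    U = vecs[:, :k]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)

    # Deterministic farthest-first initialization for k-means.
    idx = [0]
    for _ in range(1, k):
        dists = ((U[:, None, :] - U[idx][None, :, :]) ** 2).sum(-1)
        idx.append(int(np.argmax(dists.min(axis=1))))
    centers = U[idx].copy()

    # Lloyd's k-means iterations on the embedded rows.
    labels = np.zeros(len(U), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((U[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels
```

Each view is an (n_samples, n_features_v) array over the same n samples; the returned labels are cluster assignments in {0, ..., k-1}. Replacing the equal-weight average with learned weights and the squared-error loss with a robust (e.g., L2,1-norm) loss is where joint robust formulations such as JRM depart from this baseline.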
Cite this article
Liu, T., Martin, G., Zhu, Y. et al. Joint Robust Multi-view Spectral Clustering. Neural Process Lett 52, 1843–1862 (2020). https://doi.org/10.1007/s11063-020-10257-0