Abstract
Maximum margin clustering (MMC) and its improved variants are built on the spirit of the support vector machine, which inevitably leads to prohibitive computational complexity when these learning models encounter an enormous number of patterns. To accelerate clustering, we propose alternating twin bounded support vector clustering, which decomposes the original large problem in MMC and its variants into two smaller ones. Solving expensive semi-definite programs is avoided by alternating optimization between cluster-specific model parameters and instance-specific label assignments, and the structural risk minimization principle is implemented to obtain good generalization. Additionally, to avoid premature convergence, we propose a relaxed version of our algorithm in which the hinge loss of the original twin bounded support vector machine is replaced with the Laplacian loss. Both versions extend easily to the nonlinear setting via kernel tricks. To investigate the efficacy of our clustering algorithm, we conduct experiments on a number of synthetic and real-world datasets. The results demonstrate that the proposed method outperforms existing clustering approaches in both clustering accuracy and time efficiency, and that it scales to larger datasets.
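The alternating scheme described above can be illustrated with a minimal sketch. This is not the paper's actual twin bounded SVM formulation: instead of solving the hinge- or Laplacian-loss quadratic programs, each cluster is summarized by a best-fit hyperplane obtained in closed form via SVD, and labels are then reassigned to the nearer plane. All function names and parameters are illustrative.

```python
import numpy as np

def fit_plane(X):
    """Fit a hyperplane w.x + b = 0 minimizing the squared distances of
    the rows of X to the plane (a least-squares stand-in for the
    cluster-specific plane-fitting subproblem)."""
    mu = X.mean(axis=0)
    # The smallest right singular vector of the centered data is the
    # normal of the least-squares best-fit hyperplane.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    w = Vt[-1]
    b = -w @ mu
    return w, b

def alternating_plane_clustering(X, n_iter=20, seed=0):
    """Alternate between fitting one plane per cluster and reassigning
    each point to the plane it lies closer to."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=len(X))
    for _ in range(n_iter):
        planes = []
        for k in (0, 1):
            pts = X[labels == k]
            if len(pts) < 2:        # guard against an empty cluster
                pts = X
            planes.append(fit_plane(pts))
        # Distance of every point to each cluster's plane.
        d = np.stack([np.abs(X @ w + b) for w, b in planes], axis=1)
        new_labels = d.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                   # label assignments have converged
        labels = new_labels
    return labels
```

On data lying near two parallel lines, the alternating updates typically separate the two structures within a few iterations, mirroring how the full algorithm alternates between model parameters and label assignments.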







Acknowledgements
This work was supported by the National High Technology Research and Development Program of China (863 Program) (2011AA010706), the National Natural Science Foundation of China (61133016, 61772117), Ministry of Education-China Mobile Communications Corporation Research Funds (MCM20121041), the General Equipment Department Foundation (61403120102), and the Sichuan Hi-Tech industrialization program (2017GZ0308).
Cite this article
Fang, J., Liu, Q. & Qin, Z. Alternating Relaxed Twin Bounded Support Vector Clustering. Wireless Pers Commun 102, 1129–1147 (2018). https://doi.org/10.1007/s11277-017-5147-6