
Stability-based preference selection in affinity propagation

  • Original Article
  • Published in Neural Computing and Applications

Abstract

As one of the most popular exemplar-based clustering algorithms, affinity propagation has recently attracted considerable attention in a variety of fields. Its advantages include computational efficiency, insensitivity to cluster initialization, and the ability to find clusters with low error. However, the algorithm has a notable shortcoming: its clustering results depend strongly on the choice of exemplar preferences, and selecting appropriate preferences is a challenging model selection task. To tackle this problem, this paper investigates the clustering stability of affinity propagation as a criterion for automatically selecting exemplar preferences. The basic idea is to define a novel stability measure for affinity propagation and then select the exemplar preferences that produce the most stable clustering results. The proposed approach is accordingly termed stability-based affinity propagation (SAP). Experiments on a wide range of real-world datasets validate the effectiveness of the proposed SAP algorithm.
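The selection procedure described above lends itself to a short illustration: for each candidate preference value, cluster perturbed versions of the data and measure how consistently the resulting partitions agree, then keep the preference that yields the most stable partitions. The code below is a minimal sketch of that idea, assuming scikit-learn's AffinityPropagation, random subsampling as the perturbation, and the adjusted Rand index as the agreement measure; it does not reproduce the paper's own stability measure or perturbation scheme, and the damping value, subsampling fraction, and number of subsample pairs are illustrative choices only.

```python
# Minimal sketch of stability-based preference selection (assumptions:
# scikit-learn's AffinityPropagation, subsampling perturbation, ARI agreement).
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.metrics import adjusted_rand_score


def _cluster(X, preference):
    # Damping of 0.9 is an illustrative choice to encourage convergence.
    ap = AffinityPropagation(preference=preference, damping=0.9, random_state=0)
    return ap.fit(X).labels_


def stability_score(X, preference, n_pairs=10, frac=0.8, seed=0):
    """Average adjusted Rand index between clusterings of paired random
    subsamples, compared on the points the two subsamples share."""
    rng = np.random.default_rng(seed)
    n, m = X.shape[0], int(frac * X.shape[0])
    scores = []
    for _ in range(n_pairs):
        idx_a = rng.choice(n, size=m, replace=False)
        idx_b = rng.choice(n, size=m, replace=False)
        # Cluster each subsample, then compare labels on the overlap.
        lab_a = dict(zip(idx_a, _cluster(X[idx_a], preference)))
        lab_b = dict(zip(idx_b, _cluster(X[idx_b], preference)))
        common = sorted(set(idx_a) & set(idx_b))
        scores.append(adjusted_rand_score([lab_a[i] for i in common],
                                          [lab_b[i] for i in common]))
    return float(np.mean(scores))


def select_preference(X, candidate_preferences):
    """Return the candidate preference whose clusterings are most stable."""
    return max(candidate_preferences, key=lambda p: stability_score(X, p))
```

In practice the candidate preferences are usually drawn from the range of the input similarities, for example several quantiles of the pairwise negative squared Euclidean distances, with the median similarity serving as a common default.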


Notes

  1. An exemplar is a data point that best represents a cluster; it is also termed a cluster center or prototype in the data clustering literature (see the sketch below).
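To make the footnote concrete, the following minimal sketch (assuming scikit-learn's AffinityPropagation; the attribute names are that library's, not the paper's) shows that affinity propagation returns exemplars as indices of actual data points, in contrast to k-means centroids, which are generally not members of the dataset.

```python
# Minimal sketch, assuming scikit-learn's AffinityPropagation: exemplars are
# returned as indices of actual data points rather than synthetic centroids.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 2)),   # one blob around the origin
               rng.normal(5.0, 1.0, (30, 2))])  # another blob around (5, 5)

ap = AffinityPropagation(random_state=0).fit(X)
exemplar_idx = ap.cluster_centers_indices_      # row indices of X chosen as exemplars
exemplars = X[exemplar_idx]                     # the exemplar points themselves
print(len(exemplar_idx), "exemplars; cluster sizes:", np.bincount(ap.labels_))
```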


Acknowledgments

This work was supported by the NSFC (Grant Nos. 61170136, 61373101 and 71101096), the NSFC-GD Project U1201252, the Research Training Program of Sun Yat-sen University (Grant No. 13lgpyyd03), and the Pilot Program of the SYSU-CMU Shunde International Joint Research Institute.

Author information

Correspondence to Chang-Dong Wang.

About this article


Cite this article

Chen, DW., Sheng, JQ., Chen, JJ. et al. Stability-based preference selection in affinity propagation. Neural Comput & Applic 25, 1809–1822 (2014). https://doi.org/10.1007/s00521-014-1671-4

