A New Clustering Algorithm with the Convergence Proof

  • Conference paper
Knowledge-Based and Intelligent Information and Engineering Systems (KES 2011)

Abstract

Conventional clustering algorithms employ a set of features, with each feature contributing equally to the clustering procedure. Locally Adaptive Clustering (LAC) was recently proposed to address this limitation by assigning each cluster its own feature weights. However, like its traditional competitors, LAC performs poorly on data with unbalanced clusters. In this paper a novel method is proposed that handles unbalanced clusters while preserving the advantages of LAC. Where LAC forces the sum of feature weights to be equal across clusters, our method lets those sums differ. This added flexibility makes the method less prone to getting trapped in local optima and allows the cluster centers to settle in better positions than those found by rival methods.
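The per-cluster feature-weighting idea the abstract builds on can be sketched as follows. This is an illustrative LAC-style weighted k-means, not the paper's algorithm: the function name, the exponential dispersion-based weight update, and the farthest-point initialisation are all assumptions for the sketch, and each cluster's weights are normalised to sum to 1 — precisely the equal-sum constraint the paper proposes to relax.

```python
import numpy as np

def lac_style_clustering(X, k, h=1.0, iters=20):
    """Sketch of a LAC-style, per-cluster feature-weighted k-means.

    Each cluster c keeps a weight vector w[c] over the d features;
    points are assigned by weighted squared distance, and the weights
    are refreshed from the per-feature dispersion inside the cluster
    (small dispersion -> large weight). The exact update rules of LAC
    and of the paper's method differ from this illustration.
    """
    n, d = X.shape
    # Deterministic farthest-point initialisation of the k centres.
    centers = [X[0]]
    for _ in range(k - 1):
        d2 = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d2.argmax()])
    centers = np.array(centers, dtype=float)
    w = np.full((k, d), 1.0 / d)  # start with uniform feature weights

    for _ in range(iters):
        # Assign each point to the cluster with the smallest weighted distance.
        dists = np.stack([((X - centers[c]) ** 2 * w[c]).sum(axis=1)
                          for c in range(k)])
        labels = dists.argmin(axis=0)
        for c in range(k):
            members = X[labels == c]
            if len(members) == 0:
                continue  # keep the old centre if a cluster empties out
            centers[c] = members.mean(axis=0)
            # Exponentially down-weight high-dispersion features, then
            # normalise so this cluster's weights sum to 1 (the
            # constraint the paper relaxes).
            disp = ((members - centers[c]) ** 2).mean(axis=0)
            e = np.exp(-disp / h)
            w[c] = e / e.sum()
    return labels, centers, w
```

On two well-separated blobs this recovers the groups with near-uniform weights; on clusters elongated along some feature, the weights for that high-variance feature shrink, which is the locally adaptive behaviour LAC exploits.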



Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Parvin, H., Minaei-Bidgoli, B., Alizadeh, H. (2011). A New Clustering Algorithm with the Convergence Proof. In: König, A., Dengel, A., Hinkelmann, K., Kise, K., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based and Intelligent Information and Engineering Systems. KES 2011. Lecture Notes in Computer Science, vol. 6881. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23851-2_3

  • DOI: https://doi.org/10.1007/978-3-642-23851-2_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23850-5

  • Online ISBN: 978-3-642-23851-2

  • eBook Packages: Computer Science, Computer Science (R0)
