
Comparative Study on Information Theoretic Clustering and Classical Clustering Algorithms

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2012 (ICANN 2012)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 7553))


Abstract

This paper presents a comparative empirical study of clustering algorithms. We tested the information-theoretic (IT) method proposed in [2] on distinct synthetic and real (gene expression) datasets. Synthetic datasets of varying spatial complexity were chosen to verify the applicability of the algorithm, and the IT algorithm was also evaluated on real-life problems using microarray gene expression datasets. Compared with the simple but still widely used classical algorithms k-means, hierarchical clustering, and finite mixtures of Gaussians, the IT algorithm proved to be more robust in both scenarios.
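The sketch below is not the method of [2]; it only illustrates, under assumed parameters, the kind of comparison the abstract describes: the three classical baselines (k-means, hierarchical clustering, Gaussian mixture) run on a synthetic dataset with non-trivial spatial structure, plus a Parzen-window estimate of Renyi's quadratic entropy in the spirit of information-theoretic learning [11–13]. The kernel width sigma, the dataset, and the per-cluster entropy check are illustrative assumptions.

```python
# Minimal sketch (assumptions: scikit-learn baselines, a two-moons toy dataset,
# and an assumed kernel width sigma). Not the IT algorithm of reference [2].
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.datasets import make_moons
from sklearn.metrics import adjusted_rand_score
from sklearn.mixture import GaussianMixture


def renyi_quadratic_entropy(X, sigma=0.25):
    """Parzen-window estimator of Renyi's quadratic entropy H2(X) = -log V(X),
    where the information potential is
    V(X) = (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2*I)  (Gaussian kernel).
    The kernel width sigma is a free, assumed parameter."""
    n, d = X.shape
    sq_dist = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    norm = (4.0 * np.pi * sigma ** 2) ** (d / 2.0)
    potential = np.mean(np.exp(-sq_dist / (4.0 * sigma ** 2))) / norm
    return -np.log(potential)


# Synthetic dataset with non-trivial spatial structure (two interleaved moons).
X, y = make_moons(n_samples=300, noise=0.05, random_state=0)

baselines = {
    "k-means": KMeans(n_clusters=2, n_init=10, random_state=0),
    "hierarchical (Ward)": AgglomerativeClustering(n_clusters=2, linkage="ward"),
    "Gaussian mixture": GaussianMixture(n_components=2, random_state=0),
}

# Agreement of each partition with the known generating labels.
for name, model in baselines.items():
    labels = model.fit_predict(X)
    print(f"{name:20s} ARI = {adjusted_rand_score(y, labels):.3f}")

# Within-cluster Renyi quadratic entropy for the k-means partition:
# compact, well-separated clusters should yield low H2 values.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for k in range(2):
    print(f"cluster {k}: H2 = {renyi_quadratic_entropy(X[labels == k]):.3f}")
```

On data such as the two moons, partition-level scores like the adjusted Rand index make the differences between centroid-based, linkage-based, and density-based criteria visible, which is the kind of contrast the study draws between the classical baselines and the IT approach.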



References

  1. Araujo, D., Dória Neto, A., Martins, A., Melo, J.: Comparative study on dimension reduction techniques for cluster analysis of microarray data. In: The 2011 International Joint Conference on Neural Networks (IJCNN), pp. 1835–1842 (August 2011)


  2. de Araújo, D., Neto, A.D., Melo, J., Martins, A.: Clustering Using Elements of Information Theory. In: Diamantaras, K., Duch, W., Iliadis, L.S. (eds.) ICANN 2010, Part III. LNCS, vol. 6354, pp. 397–406. Springer, Heidelberg (2010)


  3. Chowdary, D., Lathrop, J., Skelton, J., Curtin, K., Briggs, T., Zhang, Y., Yu, J., Wang, Y., Mazumder, A.: Prognostic gene expression signatures can be measured in tissues collected in RNAlater preservative. J. Mol. Diagn. 8(1), 31–39 (2006)


  4. Duda, R., Hart, P., Stork, D.: Pattern Classification. Wiley (2001)


  5. Golub, T.R., Slonim, D.K., Tamayo, P., Huard, C., Gaasenbeek, M., Mesirov, J.P., Coller, H., Loh, M.L., Downing, J.R., Caligiuri, M.A., Bloomfield, C.D., Lander, E.S.: Molecular classification of cancer: class discovery and class prediction by gene expression monitoring. Science 286(5439), 531–537 (1999)


  6. Jain, A.K., Dubes, R.C.: Algorithms for clustering data. Prentice-Hall, Inc., Upper Saddle River (1988)


  7. Kuncheva, L., Hadjitodorov, S., Todorova, L.: Experimental comparison of cluster ensemble methods. In: 2006 9th International Conference on Information Fusion, pp. 1–7 (July 2006)


  8. Kuncheva, L.I.: Combining Pattern Classifiers: Methods and Algorithms. Wiley-Interscience (2004)


  9. van der Maaten, L.J.P., Postma, E.O., van den Herik, H.J.: Dimensionality Reduction: A Comparative Review (2007), http://www.cs.unimaas.nl/l.vandermaaten/dr/DR_draft.pdf

  10. Martins, A.M., Dória Neto, A., Costa, J.D., Costa, J.A.F.: Clustering using neural networks and Kullback-Leibler divergence. In: Proc. of IEEE International Joint Conference on Neural Networks, vol. 4, pp. 2813–2817


  11. Principe, J.C.: Information theoretic learning, ch. 7. John Wiley (2000)


  12. Príncipe, J.: Information Theoretic Learning: Renyi’s Entropy and Kernel Perspectives. Information Science and Statistics. Springer (2010)


  13. Principe, J.C., Xu, D.: Information-theoretic learning using Renyi's quadratic entropy. In: Proceedings of the First International Workshop on Independent Component Analysis and Signal Separation, Aussois, pp. 407–412 (1999)


  14. Rao, S., de Medeiros Martins, A., Príncipe, J.C.: Mean shift: An information theoretic perspective. Pattern Recogn. Lett. 30(3), 222–230 (2009)





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Araújo, D., Neto, A.D., Martins, A. (2012). Comparative Study on Information Theoretic Clustering and Classical Clustering Algorithms. In: Villa, A.E.P., Duch, W., Érdi, P., Masulli, F., Palm, G. (eds) Artificial Neural Networks and Machine Learning – ICANN 2012. ICANN 2012. Lecture Notes in Computer Science, vol 7553. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33266-1_57


  • DOI: https://doi.org/10.1007/978-3-642-33266-1_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33265-4

  • Online ISBN: 978-3-642-33266-1

  • eBook Packages: Computer Science, Computer Science (R0)
