
Predicting Structured Outputs k-Nearest Neighbours Method

  • Conference paper
Discovery Science (DS 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6926)

Abstract

In this work, we address several tasks of structured prediction and propose a new method for handling such tasks. Structured prediction is becoming increasingly important as data mining deals with ever more complex data (images, videos, sound, graphs, text, etc.). Our method, k-NN for structured prediction (kNN-SP), is an extension of the well-known k-nearest neighbours method and can handle three different structured prediction problems: multi-target prediction, hierarchical multi-label classification, and prediction of short time series. We evaluate the performance of kNN-SP on several datasets for each task and compare it to the performance of other structured prediction methods (predictive clustering trees and rules). We show that, despite its simplicity, the kNN-SP method performs satisfactorily on all tested problems.
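
The full text is not reproduced here, but the abstract's core idea, extending k-NN so that the prediction aggregates the entire structured outputs of the nearest neighbours, can be illustrated for the simplest of the three tasks, multi-target prediction. The sketch below is not the authors' kNN-SP implementation; it only assumes that the neighbours' numeric target vectors are combined by a component-wise mean. The function name knn_multi_target_predict and the toy data are invented for illustration.

    import numpy as np

    def knn_multi_target_predict(X_train, Y_train, x_query, k=5):
        """Predict a vector of targets for x_query by averaging the target
        vectors of its k nearest training examples (Euclidean distance).

        X_train: (n, d) array of input features.
        Y_train: (n, t) array of target vectors, one row per example.
        Returns a (t,) array: the mean target vector of the k neighbours.
        """
        # Euclidean distance from the query to every training example.
        dists = np.linalg.norm(X_train - x_query, axis=1)
        # Indices of the k closest training examples.
        nearest = np.argsort(dists)[:k]
        # Component-wise mean of the neighbours' target vectors.
        return Y_train[nearest].mean(axis=0)

    # Toy usage: 2 input features, 3 numeric targets per example.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    Y = np.column_stack([X[:, 0] + X[:, 1], X[:, 0] - X[:, 1], X[:, 0] * X[:, 1]])
    print(knn_multi_target_predict(X, Y, np.array([0.5, -0.2]), k=5))

For the other two tasks described in the abstract, the aggregation step would differ (e.g. averaging label vectors in a class hierarchy, or averaging aligned time-series values), but the neighbour search itself stays the same.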

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pugelj, M., Džeroski, S. (2011). Predicting Structured Outputs k-Nearest Neighbours Method. In: Elomaa, T., Hollmén, J., Mannila, H. (eds) Discovery Science. DS 2011. Lecture Notes in Computer Science, vol. 6926. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24477-3_22

  • DOI: https://doi.org/10.1007/978-3-642-24477-3_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-24476-6

  • Online ISBN: 978-3-642-24477-3

  • eBook Packages: Computer Science (R0)
