Cost-Sensitive Classification with k-Nearest Neighbors

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8041)

Abstract

Cost-sensitive learning algorithms are typically motivated by imbalanced data, such as clinical-diagnosis data with skewed class distributions. While other popular classification methods have been adapted to handle imbalanced data, extending k-Nearest Neighbors (kNN) classification, one of the top-10 data mining algorithms, to make it cost-sensitive remains an open problem. To fill this gap, in this paper we study two simple yet effective cost-sensitive kNN classification approaches, called Direct-CS-kNN and Distance-CS-kNN. In addition, we utilize several strategies (i.e., smoothing, minimum-cost k value selection, feature selection and ensemble selection) to improve their performance. We conduct several groups of experiments on UCI datasets to evaluate the proposed algorithms, and demonstrate that they can significantly reduce misclassification cost, often by a large margin, and consistently outperform CS-4.5 both with and without the additional enhancements.
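The general idea behind a direct cost-sensitive kNN rule can be sketched as follows: estimate class probabilities from the votes of the k nearest neighbours (with Laplace-style smoothing) and predict the class that minimises expected misclassification cost, rather than the majority class. This is a minimal illustrative sketch, not the paper's implementation; the function name `cs_knn_predict`, the Euclidean metric, and the smoothing constant are assumptions for illustration.

```python
import numpy as np

def cs_knn_predict(X_train, y_train, x, cost, k=5, smooth=1.0):
    """Hypothetical sketch of a minimum-expected-cost kNN prediction.

    cost[i][j] is the cost of predicting class j when the true class is i.
    Class probabilities are estimated from the k nearest neighbours' vote
    fractions (Laplace-smoothed), and the prediction is the class that
    minimises expected misclassification cost.
    """
    X_train = np.asarray(X_train, dtype=float)
    x = np.asarray(x, dtype=float)
    cost = np.asarray(cost, dtype=float)
    n_classes = cost.shape[0]

    # Euclidean distances to all training points; take the k nearest.
    dist = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(dist)[:k]

    # Smoothed class-probability estimates from neighbour votes.
    counts = np.bincount(np.asarray(y_train)[nn], minlength=n_classes)
    probs = (counts + smooth) / (k + smooth * n_classes)

    # Expected cost of predicting class j: sum_i P(i) * cost[i, j].
    expected = probs @ cost
    return int(np.argmin(expected))
```

With a cost matrix that penalises missing the minority class heavily, a query point whose neighbourhood is dominated by the majority class can still be assigned the minority label, which is exactly the behaviour a cost-sensitive variant is meant to add over plain majority voting.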




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Qin, Z., Wang, A.T., Zhang, C., Zhang, S. (2013). Cost-Sensitive Classification with k-Nearest Neighbors. In: Wang, M. (ed.) Knowledge Science, Engineering and Management. KSEM 2013. Lecture Notes in Computer Science, vol 8041. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39787-5_10


  • DOI: https://doi.org/10.1007/978-3-642-39787-5_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-39786-8

  • Online ISBN: 978-3-642-39787-5

  • eBook Packages: Computer Science (R0)
