Handling Concept Drift via Ensemble and Class Distribution Estimation Technique

  • Conference paper
Advanced Data Mining and Applications (ADMA 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7121)

Abstract

In real-world settings, the class distribution of the data may change after a classifier has been built, degrading the classifier's performance. A previous attempt to solve this problem, the Class Distribution Estimation (CDE) method, yields quite interesting performance; however, we observe a flaw: the CDE method remains biased toward the training data. We therefore improve it with an ensemble approach. Our Class Distribution Estimation-Ensemble (CDE-EM) methods estimate the class distribution from many models instead of one, resulting in less bias than the previous method. All methods are evaluated by accuracy on a set of benchmark UCI data sets. Experimental results demonstrate that our methods yield better performance when the class distribution of the test data differs from that of the training data.
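The paper's CDE-EM algorithm itself is not reproduced here, but the underlying idea the abstract describes (re-estimating the test set's class priors and adjusting a classifier's posterior outputs accordingly) can be illustrated with the single-model EM procedure of Saerens, Latinne, and Decaestecker (reference 14). This is a minimal sketch: all function and variable names are our own, the data is synthetic, and the ensemble step that distinguishes CDE-EM is omitted.

```python
# Illustrative sketch (NOT the paper's CDE-EM method): EM re-estimation of
# test-set class priors from a classifier's posteriors, following the simple
# procedure of Saerens et al. (reference 14). Names and data are invented.
import numpy as np

def adjust_to_new_priors(posteriors, train_priors, n_iter=100, tol=1e-6):
    """Return (estimated test priors, posteriors adjusted to those priors).

    `posteriors` are the classifier's outputs on unlabeled test data,
    computed under the (possibly wrong) training priors `train_priors`.
    """
    priors = train_priors.copy()
    for _ in range(n_iter):
        # E-step: rescale each posterior by the ratio of current to
        # training priors, then renormalize row-wise.
        adjusted = posteriors * (priors / train_priors)
        adjusted /= adjusted.sum(axis=1, keepdims=True)
        # M-step: the new prior estimate is the mean adjusted posterior.
        new_priors = adjusted.mean(axis=0)
        if np.abs(new_priors - priors).max() < tol:
            priors = new_priors
            break
        priors = new_priors
    adjusted = posteriors * (priors / train_priors)
    adjusted /= adjusted.sum(axis=1, keepdims=True)
    return priors, adjusted

# Toy setting: a classifier calibrated under 50/50 training priors is applied
# to a test set whose true class distribution has drifted to 80/20.
rng = np.random.default_rng(0)
train_priors = np.array([0.5, 0.5])
true_test_priors = np.array([0.8, 0.2])
mu = np.array([-1.0, 1.0])                     # class-conditional means
y = rng.choice(2, size=5000, p=true_test_priors)
x = rng.normal(mu[y], 1.0)                     # 1-D Gaussian features
# Posteriors computed via Bayes' rule with the stale 50/50 training priors.
lik = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2)
posteriors = lik * train_priors
posteriors /= posteriors.sum(axis=1, keepdims=True)
est_priors, _ = adjust_to_new_priors(posteriors, train_priors)
```

Where this sketch uses one model's posteriors to estimate the drifted distribution, the paper's CDE-EM approach combines estimates from many models, which is what reduces the bias toward the training data noted in the abstract.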

References

  1. Alaiz-Rodríguez, R., Guerrero-Curieses, A., Cid-Sueiro, J.: Minimax Regret Classifier for Imprecise Class Distributions. Journal of Machine Learning Research 8 (2007)

  2. Alaiz-Rodríguez, R., Japkowicz, N.: Assessing the Impact of Changing Environments on Classifier Performance. In: Bergler, S. (ed.) Canadian AI. LNCS (LNAI), vol. 5032, pp. 13–24. Springer, Heidelberg (2008)

  3. Asuncion, A., Newman, D.J.: UCI Machine Learning Repository (2007), http://www.ics.uci.edu/~mlearn/MLRepository.html

  4. Chapelle, O., Scholkopf, B., Zien, A.: Semi-Supervised Learning. MIT Press, Cambridge (2006)

  5. Chawla, N.V., Bowyer, K.W., Kegelmeyer, W.P.: SMOTE: Synthetic Minority Over-sampling Technique. Journal of Artificial Intelligence Research, 16 (2002)

  6. Elkan, C.: The foundations of cost-sensitive learning. In: The Proceedings of the 17th International Joint Conference on Artificial Intelligence, pp. 973–978 (2001)

  7. Forman, G.: Quantifying counts and costs via classification. Data Mining Knowledge Discovery 17(2) (2008)

  8. Forman, G.: Counting Positives Accurately Despite Inaccurate Classification. In: Gama, J., Camacho, R., Brazdil, P.B., Jorge, A.M., Torgo, L. (eds.) ECML 2005. LNCS (LNAI), vol. 3720, pp. 564–575. Springer, Heidelberg (2005)

  9. Latinne, P., Saerens, M., Decaestecker, C.: Adjusting the Outputs of a Classifier to New a Priori Probabilities May Significantly Improve Classification Accuracy: Evidence from a multi-class problem in remote sensing. In: The Proceedings of the 18th International Conference on Machine Learning, pp. 298–305 (2001)

  10. Provost, F., Fawcett, T.: Robust Classification for Imprecise Environments. Machine Learning 42(3) (2001)

  11. Provost, F., Fawcett, T., Kohavi, R.: The Case against Accuracy Estimation for Comparing Induction Algorithms. In: The Proceedings of the 15th International Conference on Machine Learning, pp. 445–453 (1998)

  12. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann (1993)

  13. Rokach, L.: Ensemble-based classifiers. Artificial Intelligence Review 33 (2010)

  14. Saerens, M., Latinne, P., Decaestecker, C.: Adjusting the outputs of a classifier to new a priori probabilities: A simple procedure. Neural Computation 14 (2002)

  15. Tsymbal, A.: The problem of concept drift: Definitions and related work. Technical report, Computer Science Department, Trinity College Dublin (2004)

  16. Weiss, G.M.: Mining with rarity: a unifying framework. SIGKDD Explorations Newsletter 6(1) (2004)

  17. Weiss, G.M., Provost, F.: Learning when training data are costly: The effect of class distribution on tree induction. Journal of Artificial Intelligence Research 19 (2003)

  18. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, 2nd edn. Morgan Kaufmann (2005)

  19. Xue, J.C., Weiss, G.M.: Quantification and semi-supervised classification methods for handling changes in class distribution. In: Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 897–906. ACM, Paris (2009)

  20. Zadrozny, B., Elkan, C.: Learning and making decisions when costs and probabilities are both unknown. In: The Proceedings of the 7th International Conference on Knowledge Discovery and Data Mining, pp. 204–213 (2001)


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Limsetto, N., Waiyamai, K. (2011). Handling Concept Drift via Ensemble and Class Distribution Estimation Technique. In: Tang, J., King, I., Chen, L., Wang, J. (eds) Advanced Data Mining and Applications. ADMA 2011. Lecture Notes in Computer Science, vol 7121. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25856-5_2

  • DOI: https://doi.org/10.1007/978-3-642-25856-5_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-25855-8

  • Online ISBN: 978-3-642-25856-5

  • eBook Packages: Computer Science, Computer Science (R0)
