The AQ Methods for Concept Drift

  • Chapter
Advances in Machine Learning I

Part of the book series: Studies in Computational Intelligence (SCI, volume 262)

Abstract

Since the mid-1990s, we have developed, implemented, and evaluated a number of learning methods that cope with concept drift. Drift occurs when the target concept that a learner must acquire changes over time. It is present in applications involving user preferences (e.g., calendar scheduling) and adversaries (e.g., spam detection). We based early efforts on Michalski's aq algorithm, and our more recent work has investigated ensemble methods. We have also implemented several methods that other researchers have proposed. In this chapter, we survey results that we have obtained since the mid-1990s on the Stagger concepts using learning methods for concept drift. We examine our methods based on the aq algorithm, our ensemble methods, and the methods of other researchers. Dynamic weighted majority with an incremental decision-tree algorithm as the base learner achieved the best performance after the first drift point, with an area under the performance curve of .882. Systems based on the aq11 algorithm, which induces rules incrementally, performed comparably, achieving areas of .875. Indeed, an aq11 system with partial instance memory and Widmer and Kubat's window adjustment heuristic achieved the best overall performance, with an area under the entire performance curve of .898.
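The core idea behind dynamic weighted majority, as summarized above, can be sketched in a few lines: maintain an ensemble of incremental base learners, multiply an expert's weight by a factor β when it errs, add a fresh expert when the weighted ensemble as a whole errs, and prune experts whose weight falls below a threshold. The sketch below is an illustration under stated assumptions, not the chapter's actual implementation: the `MajorityClass` base learner and the parameter values β = 0.5 and θ = 0.01 are placeholders (the chapter's best-performing configuration used an incremental decision-tree learner as the base).

```python
# Minimal sketch of the dynamic weighted majority (DWM) idea.
# MajorityClass is a stand-in base learner; the real method pairs DWM
# with a genuine incremental learner such as a decision-tree inducer.

class MajorityClass:
    """Trivial incremental learner: predicts the most frequent label seen."""
    def __init__(self):
        self.counts = {}

    def learn(self, x, y):
        self.counts[y] = self.counts.get(y, 0) + 1

    def predict(self, x):
        if not self.counts:
            return None
        return max(self.counts, key=self.counts.get)


class DWM:
    def __init__(self, beta=0.5, theta=0.01, make_expert=MajorityClass):
        self.beta = beta            # weight multiplier applied on a mistake
        self.theta = theta          # prune experts whose weight falls below this
        self.make_expert = make_expert
        self.experts = [make_expert()]
        self.weights = [1.0]

    def predict(self, x):
        # Weighted vote over the experts' predictions.
        votes = {}
        for expert, w in zip(self.experts, self.weights):
            y = expert.predict(x)
            if y is not None:
                votes[y] = votes.get(y, 0.0) + w
        return max(votes, key=votes.get) if votes else None

    def learn(self, x, y):
        # 1. Penalize each expert that predicted incorrectly.
        for i, expert in enumerate(self.experts):
            if expert.predict(x) not in (None, y):
                self.weights[i] *= self.beta
        # 2. If the weighted ensemble erred, add a fresh expert.
        if self.predict(x) != y:
            self.experts.append(self.make_expert())
            self.weights.append(1.0)
        # 3. Normalize weights and prune low-weight experts.
        top = max(self.weights)
        self.weights = [w / top for w in self.weights]
        keep = [(e, w) for e, w in zip(self.experts, self.weights)
                if w >= self.theta]
        self.experts, self.weights = [list(t) for t in zip(*keep)]
        # 4. Train every surviving expert on the example.
        for expert in self.experts:
            expert.learn(x, y)
```

When the target concept drifts, the old experts start erring, their weights decay geometrically, and newly created experts trained only on post-drift examples quickly dominate the vote, which is what lets the ensemble recover after each Stagger drift point.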


References

  1. Alpaydin, E.: Introduction to Machine Learning. MIT Press, Cambridge (2004)

  2. Bach, S.H., Maloof, M.A.: Paired learners for concept drift. In: Proceedings of the Eighth IEEE International Conference on Data Mining. IEEE Press, Los Alamitos (2008)

  3. Becker, H., Arias, M.: Real-time ranking with concept drift using expert advice. In: Proceedings of the Thirteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 86–94. ACM Press, New York (2007)

  4. Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, Heidelberg (2006)

  5. Blum, A.: Empirical support for winnow and weighted-majority algorithms: Results on a calendar scheduling domain. Machine Learning 26, 5–23 (1997)

  6. Fan, W.: StreamMiner: A classifier ensemble-based engine to mine concept-drifting data streams. In: Proceedings of the Thirtieth International Conference on Very Large Data Bases, pp. 1257–1260. Morgan Kaufmann, San Francisco (2004)

  7. Gama, J., Medas, P., Rodrigues, P.: Learning decision trees from dynamic data streams. In: Proceedings of the 2005 ACM Symposium on Applied Computing (SAC 2005), pp. 573–577. ACM Press, New York (2005)

  8. Harries, M., Sammut, C., Horn, K.: Extracting hidden context. Machine Learning 32(2), 101–126 (1998)

  9. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, Heidelberg (2001)

  10. Hulten, G., Spencer, L., Domingos, P.: Mining time-changing data streams. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 97–106. ACM Press, New York (2001)

  11. Kolter, J.Z., Maloof, M.A.: Dynamic weighted majority: A new ensemble method for tracking concept drift. In: Proceedings of the Third IEEE International Conference on Data Mining, pp. 123–130. IEEE Press, Los Alamitos (2003)

  12. Kolter, J.Z., Maloof, M.A.: Using additive expert ensembles to cope with concept drift. In: Proceedings of the Twenty-second International Conference on Machine Learning, pp. 449–456. ACM Press, New York (2005)

  13. Kolter, J.Z., Maloof, M.A.: Dynamic weighted majority: An ensemble method for drifting concepts. Journal of Machine Learning Research 8, 2755–2790 (2007)

  14. Langley, P.W.: Elements of Machine Learning. Morgan Kaufmann, San Francisco (1996)

  15. Littlestone, N.: Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning 2, 285–318 (1988)

  16. Littlestone, N., Warmuth, M.K.: The weighted majority algorithm. Information and Computation 108, 212–261 (1994)

  17. Maloof, M.A.: Progressive partial memory learning. Ph.D. thesis, School of Information Technology and Engineering, George Mason University, Fairfax, VA (1996)

  18. Maloof, M.A.: Incremental rule learning with partial instance memory for changing concepts. In: Proceedings of the International Joint Conference on Neural Networks, pp. 2764–2769. IEEE Computer Society Press, Los Alamitos (2003)

  19. Maloof, M.A.: Concept drift. In: Wang, J. (ed.) Encyclopedia of Data Warehousing and Mining, pp. 202–206. Information Science Publishing, Hershey (2005)

  20. Maloof, M.A., Michalski, R.S.: Selecting examples for partial memory learning. Machine Learning 41, 27–52 (2000)

  21. Maloof, M.A., Michalski, R.S.: Incremental learning with partial instance memory. Artificial Intelligence 154, 95–126 (2004)

  22. Michalski, R.S.: On the quasi-minimal solution of the general covering problem. In: Proceedings of the Fifth International Symposium on Information Processing, vol. A3, pp. 125–128 (1969)

  23. Michalski, R.S., Larson, J.B.: Selection of most representative training examples and incremental generation of VL1 hypotheses: The underlying methodology and the description of programs ESEL and AQ11. Technical Report UIUCDCS-R-78-867, Department of Computer Science, University of Illinois, Urbana (1978)

  24. Michalski, R.S., Mozetic, I., Hong, J., Lavrac, N.: The multi-purpose incremental learning system AQ15 and its testing application to three medical domains. In: Proceedings of the Fifth National Conference on Artificial Intelligence, pp. 1041–1045. AAAI Press, Menlo Park (1986)

  25. Mitchell, T.M.: Machine Learning. McGraw-Hill, New York (1997)

  26. Mitchell, T.M., Caruana, R., Freitag, D., McDermott, J., Zabowski, D.: Experience with a learning personal assistant. Communications of the ACM 37(7), 80–91 (1994)

  27. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)

  28. Russell, S.J., Norvig, P.: Artificial Intelligence: A Modern Approach, 2nd edn. Prentice Hall, Upper Saddle River (2003)

  29. Schlimmer, J.C.: Concept acquisition through representational adjustment. Ph.D. thesis, Department of Information and Computer Science, University of California, Irvine (1987)

  30. Schlimmer, J.C., Granger, R.H.: Beyond incremental processing: Tracking concept drift. In: Proceedings of the Fifth National Conference on Artificial Intelligence, pp. 502–507. AAAI Press, Menlo Park (1986)

  31. Street, W.N., Kim, Y.: A streaming ensemble algorithm (SEA) for large-scale classification. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 377–382. ACM Press, New York (2001)

  32. Utgoff, P.E., Berkman, N.C., Clouse, J.A.: Decision tree induction based on efficient tree restructuring. Machine Learning 29, 5–44 (1997)

  33. Wang, H., Fan, W., Yu, P.S., Han, J.: Mining concept-drifting data streams using ensemble classifiers. In: Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 226–235. ACM Press, New York (2003)

  34. Widmer, G., Kubat, M.: Learning in the presence of concept drift and hidden contexts. Machine Learning 23, 69–101 (1996)

  35. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, 2nd edn. Morgan Kaufmann, San Francisco (2005)



Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Maloof, M.A. (2010). The AQ Methods for Concept Drift. In: Koronacki, J., Raś, Z.W., Wierzchoń, S.T., Kacprzyk, J. (eds) Advances in Machine Learning I. Studies in Computational Intelligence, vol 262. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05177-7_2

  • DOI: https://doi.org/10.1007/978-3-642-05177-7_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-05176-0

  • Online ISBN: 978-3-642-05177-7
