Abstract
Classifier ensembles are an active topic in learning from non-stationary data. In particular, batch growing ensemble methods represent one important direction for dealing with the concept drift involved in non-stationary data. However, current batch growing ensemble methods combine only the available component classifiers, each trained independently from a batch of non-stationary data. They simply discard the interim ensembles and hence may lose useful information obtained from these fine-tuned interim ensembles. In contrast, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2). The new method combines classifiers as an ensemble of all the interim ensembles built dynamically from consecutive batches of non-stationary data. DE2 comprises two key stages: (1) component classifiers and interim ensembles are dynamically trained; (2) the final ensemble is then learned by exponentially-weighted averaging over the available experts, i.e., the interim ensembles. We employ sparsity learning to select component classifiers intelligently, and we incorporate techniques from Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We evaluate DE2 on data from a typical non-stationary environment, the PASCAL Large Scale Learning Challenge 2008 Webspam data, and compare it with other competitive ensemble methods. Experimental results confirm that our approach consistently achieves better performance and promising generalization ability for learning in non-stationary environments.
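The second stage described above, combining interim ensembles as "experts" by exponentially-weighted averaging, can be sketched in a minimal form. The class name, interface, learning rate η, and the use of callables as stand-ins for trained interim ensembles are all illustrative assumptions, not the authors' implementation:

```python
import math

class ExponentiallyWeightedEnsemble:
    """Sketch of combining experts (interim ensembles) by
    exponentially-weighted averaging, assuming binary labels in {-1, +1}."""

    def __init__(self, eta=0.5):
        self.eta = eta      # learning rate of the exponential update (assumed value)
        self.experts = []   # pool of interim ensembles, each a callable x -> {-1, +1}
        self.weights = []   # one non-negative weight per expert

    def add_expert(self, expert):
        # A newly trained interim ensemble joins the pool with unit weight.
        self.experts.append(expert)
        self.weights.append(1.0)

    def predict(self, x):
        # Weighted vote over the experts' predictions.
        score = sum(w * e(x) for w, e in zip(self.weights, self.experts))
        return 1 if score >= 0 else -1

    def update(self, x, y):
        # Exponentially down-weight experts that mispredict (x, y),
        # then renormalize so the weights remain comparable over time.
        for i, e in enumerate(self.experts):
            loss = 1.0 if e(x) != y else 0.0
            self.weights[i] *= math.exp(-self.eta * loss)
        total = sum(self.weights)
        self.weights = [w / total for w in self.weights]
```

With each new batch, a fresh interim ensemble would be added via `add_expert`, and `update` would be called on incoming labeled examples so that experts tuned to outdated concepts gradually lose influence.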
References
Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. Journal of Machine Learning Research 8(2), 2755–2790 (2007)
Schlimmer, J., Granger, R.: Beyond incremental processing: Tracking concept drift. In: Proceedings of National Conference on Artificial Intelligence, pp. 502–507 (1986)
Widmer, G., Kubat, M.: Learning in the presence of concept drift and hidden contexts. Machine Learning 23(1), 69–101 (1996)
Kuncheva, L.: Classifier ensembles for changing environments. In: Proceedings of International Workshop on Multiple Classifier Systems, pp. 1–15 (2004)
Littlestone, N., Warmuth, M.: The weighted majority algorithm. Information Computation 108(2), 212–261 (1994)
Herbster, M., Warmuth, M.: Tracking the best expert. Machine Learning 32(2), 151–178 (1998)
Bousquet, O., Warmuth, M.: Tracking a small set of experts by mixing past posteriors. Journal of Machine Learning Research 3(1), 363–396 (2002)
Kolter, J., Maloof, M.: Using additive expert ensembles to cope with concept drift. In: Proceedings of International Conference on Machine Learning, pp. 449–456 (2005)
Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. In: Proceedings of IEEE International Conference on Data Mining, pp. 123–130 (2003)
Street, W., Kim, Y.: A streaming ensemble algorithm (SEA) for large-scale classification. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 377–382 (2001)
Fan, W.: Streamminer: A classifier ensemble-based engine to mine concept-drifting data streams. In: Proceedings of International Conference on Very Large Data Bases, pp. 1257–1260 (2004)
Chen, S., He, H.: SERA: Selectively recursive approach towards nonstationary imbalanced stream data mining. In: International Joint Conference on Neural Networks, pp. 522–529 (2009)
Chen, S., He, H.: Toward incremental learning of nonstationary imbalanced data stream: A multiple selectively recursive approach. Evolving Systems 2(1), 30–50 (2011)
Elwell, R., Polikar, R.: Incremental learning of concept drift in nonstationary environments. IEEE Trans. Neural Networks 22(10), 1517–1531 (2011)
Shalizi, C., Jacobs, A., Klinkner, K., Clauset, A.: Adapting to non-stationarity with growing expert ensembles, arXiv:1103.0949 (2011)
Webb, S., Caverlee, J., Pu, C.: Introducing the webb spam corpus: using email spam to identify web spam automatically. In: Proceedings of Third Conference on Email and Anti-Spam (2006)
Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines. ACM Trans. Intelligent Systems and Technology 2(3), 1–27 (2011)
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Yin, XC., Huang, K., Hao, HW. (2013). Dynamic Ensemble of Ensembles in Nonstationary Environments. In: Lee, M., Hirose, A., Hou, ZG., Kil, R.M. (eds) Neural Information Processing. ICONIP 2013. Lecture Notes in Computer Science, vol 8227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42042-9_10
Print ISBN: 978-3-642-42041-2
Online ISBN: 978-3-642-42042-9