Network of Experts: Learning from Evolving Data Streams Through Network-Based Ensembles

  • Conference paper
  • Neural Information Processing (ICONIP 2019)

Abstract

Ensemble classifiers are a promising approach for data stream classification. Although diversity influences the performance of ensemble classifiers, current studies do not take advantage of relations between component classifiers to improve their performance. This paper addresses this issue by proposing a new kind of ensemble learner for data stream classification, which explicitly defines relations between component classifiers. These relations are then used in various ways, e.g., to combine the decisions of component models. The hypothesis is that an ensemble learner can yield accurate predictions in a streaming environment based on a structural analysis of a weighted network of its component models. Implications, limitations, and benefits of this assumption are discussed. A formal description of a network-based ensemble for data streams is presented, along with an algorithm that implements it, named Network of Experts (NetEx). Empirical experiments show that NetEx’s accuracy and processing time are competitive with state-of-the-art ensembles.
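
The paper's own formalization is not reproduced on this page, but the idea stated in the abstract, i.e., treating component classifiers as nodes of a weighted network whose structure is used to combine their decisions, can be sketched as follows. This is a minimal, hypothetical illustration and not the NetEx algorithm: the choice of pairwise agreement over a sliding window as the edge weight, and of node strength as the voting weight, are assumptions made here for concreteness.

```python
import itertools
from collections import defaultdict


class NetworkEnsembleSketch:
    """Hypothetical network-based combiner (illustrative only, not NetEx).

    Each component classifier must expose predict(x). Edges between
    classifiers are weighted by how often they agreed on recent instances,
    and a classifier's vote is weighted by its strength in that network.
    """

    def __init__(self, experts, window=200):
        self.experts = list(experts)
        self.window = window      # number of recent prediction tuples kept
        self.recent = []          # one tuple of per-expert predictions per instance

    def _node_strength(self):
        """Sum of agreement-based edge weights incident to each expert."""
        strength = [0.0] * len(self.experts)
        if not self.recent:
            return strength
        for i, j in itertools.combinations(range(len(self.experts)), 2):
            agreement = sum(p[i] == p[j] for p in self.recent) / len(self.recent)
            strength[i] += agreement
            strength[j] += agreement
        return strength

    def predict(self, x):
        preds = tuple(e.predict(x) for e in self.experts)
        strength = self._node_strength()
        votes = defaultdict(float)
        for p, s in zip(preds, strength):
            votes[p] += s if s > 0.0 else 1.0   # plain voting until the window fills
        self.recent.append(preds)               # update the network as the stream advances
        if len(self.recent) > self.window:
            self.recent.pop(0)
        return max(votes, key=votes.get)
```

In an actual streaming setting the component models would also be trained incrementally and the network rebuilt as concepts drift; the sketch only shows how a weighted network of component models could drive vote combination.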

Notes

  1. In NetEx, the number of subspaces is fixed, and the number of features is the same for all classifiers (a minimal sketch of this setup follows the notes).

  2. DWM [22] is an exception, as it does not include a maximum or target number of base models.
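
For concreteness, the fixed-size subspace setup from note 1 can be sketched as below. This is a hypothetical illustration, not the NetEx implementation; the function name, its parameters, and the seeding are assumptions made for the example.

```python
import random


def assign_subspaces(n_features, n_classifiers, subspace_size, seed=1):
    """Give every classifier the same number of randomly chosen features."""
    rng = random.Random(seed)
    features = list(range(n_features))
    return [sorted(rng.sample(features, subspace_size))
            for _ in range(n_classifiers)]


# Example: 10 classifiers, each restricted to 5 of 20 features.
subspaces = assign_subspaces(n_features=20, n_classifiers=10, subspace_size=5)
```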

References

  1. Bifet, A., Holmes, G., Pfahringer, B.: Leveraging bagging for evolving data streams. In: PKDD, pp. 135–150 (2010)

  2. Bifet, A., Gavaldà, R.: Learning from time-changing data with adaptive windowing. In: SIAM (2007)

  3. Bifet, A., Holmes, G., Kirkby, R., Pfahringer, B.: MOA data stream mining - a practical approach. Centre for Open Software Innovation (2011). http://heanet.dl.sourceforge.net/project/moa-datastream/documentation/StreamMining.pdf

  4. Bifet, A., Holmes, G., Pfahringer, B., Gavaldà, R.: Improving adaptive bagging methods for evolving data streams. In: Zhou, Z.-H., Washio, T. (eds.) ACML 2009. LNCS (LNAI), vol. 5828, pp. 23–37. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-05224-8_4

  5. Boccaletti, S., Latora, V., Moreno, Y., Chavez, M., Hwang, D.U.: Complex networks: structure and dynamics. Phys. Rep. 424(4), 175–308 (2006)

  6. Breiman, L.: Bagging predictors. Mach. Learn. 24(2), 123–140 (1996)

  7. Breiman, L.: Random forests. Mach. Learn. 45(1), 5–32 (2001)

  8. Brown, G., Wyatt, J., Harris, R., Yao, X.: Diversity creation methods: a survey and categorisation. J. Inf. Fusion 6, 5–20 (2005)

  9. Brzezinski, D., Stefanowski, J.: Combining block-based and online methods in learning ensembles from concept drifting data streams. Inf. Sci. 265, 50–67 (2014)

  10. Chen, S.T., Lin, H.T., Lu, C.J.: An online boosting algorithm with theoretical justifications. In: ICML, June 2012

  11. Da Xu, L., He, W., Li, S.: Internet of Things in industries: a survey. IEEE Trans. Industr. Inf. 10(4), 2233–2243 (2014)

  12. Dalirsefat, S.B., da Silva Meyer, A., Mirhoseini, S.Z.: Comparison of similarity coefficients used for cluster analysis with amplified fragment length polymorphism markers in the silkworm, bombyx mori. J. Insect Sci. 9(71), 1–8 (2009)

  13. Freund, Y., Schapire, R.E., et al.: Experiments with a new boosting algorithm. ICML 96, 148–156 (1996)

  14. Gama, J., Rodrigues, P.: Issues in evaluation of stream learning algorithms. In: 15th ACM SIGKDD, pp. 329–338. ACM SIGKDD, June 2009

  15. Gama, J., Zliobaite, I., Bifet, A., Pechenizkiy, M., Bouchachia, A.: A survey on concept drift adaptation. ACM CSUR 46(4), 44:1–44:37 (2014)

  16. Gomes, H.M., et al.: Adaptive random forests for evolving data stream classification. Mach. Learn. 106, 1–27 (2017)

  17. Gomes, H.M., Barddal, J.P., Enembreck, F., Bifet, A.: A survey on ensemble learning for data stream classification. ACM CSUR 50(2), 23:1–23:36 (2017)

  18. Gomes, H.M., Enembreck, F.: SAE: Social adaptive ensemble classifier for data streams. In: CIDM, pp. 199–206 (2013)

  19. Gomes, H.M., Enembreck, F.: SAE2: advances on the social adaptive ensemble classifier for data streams. In: SAC. ACM, March 2014

  20. Ho, T.K.: The random subspace method for constructing decision forests. IEEE Trans. Pattern Anal. Mach. Intell. 20(8), 832–844 (1998)

  21. Holmes, G., Kirkby, R., Pfahringer, B.: Stress-testing Hoeffding trees. In: PKDD, pp. 495–502 (2005)

  22. Kolter, J.Z., Maloof, M.A.: Dynamic weighted majority: an ensemble method for drifting concepts. J. Mach. Learn. Res. 8, 2755–2790 (2007)

  23. Kuncheva, L.I.: Combining Pattern Classifiers: Methods and Algorithms. Wiley, Hoboken (2004)

  24. Kuncheva, L.I., Whitaker, C.J.: Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Mach. Learn. 51(2), 181–207 (2003)

  25. Kuncheva, L.I., Whitaker, C.J., Shipp, C.A., Duin, R.P.: Limits on the majority vote accuracy in classifier fusion. Pattern Anal. Appl. 6(1), 22–31 (2003)

  26. Levandowsky, M., Winter, D.: Distance between sets. Nature 234(5323), 34–35 (1971)

  27. Nikolentzos, G., Meladianos, P., Limnios, S., Vazirgiannis, M.: A degeneracy framework for graph similarity. In: IJCAI, pp. 2595–2601 (2018)

  28. Oza, N.: Online bagging and boosting. In: IEEE SMC, vol. 3, pp. 2340–2345 (2005)

  29. Silva, T.C., Zhao, L.: Machine Learning in Complex Networks, vol. 2016. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-17290-3

  30. Wasserman, S., Faust, K.: Social Network Analysis: Methods and Applications, vol. 8. Cambridge University Press, Cambridge (1994)

Author information

Corresponding author: Heitor Murilo Gomes.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Gomes, H.M., Bifet, A., Fournier-Viger, P., Granatyr, J., Read, J. (2019). Network of Experts: Learning from Evolving Data Streams Through Network-Based Ensembles. In: Gedeon, T., Wong, K., Lee, M. (eds) Neural Information Processing. ICONIP 2019. Lecture Notes in Computer Science, vol. 11953. Springer, Cham. https://doi.org/10.1007/978-3-030-36708-4_58

  • DOI: https://doi.org/10.1007/978-3-030-36708-4_58

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36707-7

  • Online ISBN: 978-3-030-36708-4

  • eBook Packages: Computer Science, Computer Science (R0)
