An Empirical Study of Fuzzy Decision Tree for Gradient Boosting Ensemble

  • Conference paper
  • In: AI 2021: Advances in Artificial Intelligence (AI 2022)

Abstract

Gradient boosting has proved to be an effective ensemble learning paradigm for combining multiple weak learners into a strong one. However, its performance gains are still limited by decision errors caused by uncertainty. Fuzzy decision trees are designed to address the uncertainty that arises from the limitations and incompleteness of collected information. This paper investigates whether fuzzy decision trees can improve the robustness of gradient boosting, even when the decision conditions and objectives are fuzzy. We first propose and implement a fuzzy decision tree (FDT) based on two widely cited fuzzy decision tree designs. We then propose and implement a fuzzy gradient boosting decision tree (FGBDT), which integrates a set of FDTs as weak learners. Both algorithms can be reduced to their non-fuzzy counterparts via parameter settings. To study whether fuzzification improves the proposed algorithms on classification tasks, we pair each algorithm with its non-fuzzy counterpart and run comparison experiments on UCI Repository datasets under the same settings. The experiments show that the fuzzy algorithms outperform their non-fuzzy counterparts on many classical classification tasks. The code is available at github.com/ZhaoqingLiu/FuzzyTrees.
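The authors' FDT and FGBDT implementations are in the linked repository. As an illustration only, the sketch below shows the general idea the abstract describes: a gradient boosting ensemble whose weak learners use soft (fuzzy) splits, with a fuzziness parameter that reduces to crisp splits at zero, mirroring the "can be set as non-fuzzy algorithms by parameters" design. All names here (`FuzzyStump`, `boost`, `gamma`) and the sigmoid membership function are illustrative assumptions, not the authors' API.

```python
import math

def membership(x, t, gamma):
    """Degree to which x belongs to the 'right' branch of split threshold t.
    gamma == 0 gives a crisp (non-fuzzy) split, as in an ordinary stump."""
    if gamma == 0:
        return 1.0 if x >= t else 0.0
    return 1.0 / (1.0 + math.exp(-(x - t) / gamma))

class FuzzyStump:
    """One-level fuzzy decision tree: each sample contributes to both
    leaves in proportion to its membership degree."""

    def fit(self, xs, residuals, gamma):
        best = None
        for t in sorted(set(xs)):
            m = [membership(x, t, gamma) for x in xs]
            wr = sum(m)                 # total membership of the right leaf
            wl = len(xs) - wr           # total membership of the left leaf
            right = (sum(mi * r for mi, r in zip(m, residuals)) / wr) if wr else 0.0
            left = (sum((1 - mi) * r for mi, r in zip(m, residuals)) / wl) if wl else 0.0
            err = sum((r - (mi * right + (1 - mi) * left)) ** 2
                      for mi, r in zip(m, residuals))
            if best is None or err < best[0]:
                best = (err, t, left, right)
        _, self.t, self.left, self.right = best
        self.gamma = gamma
        return self

    def predict(self, x):
        m = membership(x, self.t, self.gamma)
        return (1 - m) * self.left + m * self.right

def boost(xs, ys, n_rounds=20, lr=0.5, gamma=0.0):
    """Squared-loss gradient boosting: each stump is fitted to the
    residuals (negative gradients) of the ensemble so far."""
    base = sum(ys) / len(ys)
    stumps = []
    preds = [base] * len(ys)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = FuzzyStump().fit(xs, residuals, gamma)
        stumps.append(stump)
        preds = [p + lr * stump.predict(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s.predict(x) for s in stumps)
```

Setting `gamma=0.0` recovers ordinary crisp gradient boosting stumps, while `gamma > 0` lets samples near a split threshold influence both branches, which is the kind of uncertainty handling the paper studies:

```python
crisp = boost([0, 1, 2, 3, 4, 5], [0, 0, 0, 1, 1, 1], gamma=0.0)
fuzzy = boost([0, 1, 2, 3, 4, 5], [0, 0, 0, 1, 1, 1], gamma=0.5)
```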


Notes

  1. https://github.com/ZhaoqingLiu/FuzzyTrees.


Acknowledgment

The work presented in this paper was supported by the Australian Research Council (ARC) under Discovery Project DP190101733.

Author information

Corresponding author

Correspondence to Jie Lu.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Liu, Z., Liu, A., Zhang, G., Lu, J. (2022). An Empirical Study of Fuzzy Decision Tree for Gradient Boosting Ensemble. In: Long, G., Yu, X., Wang, S. (eds) AI 2021: Advances in Artificial Intelligence. AI 2022. Lecture Notes in Computer Science(), vol 13151. Springer, Cham. https://doi.org/10.1007/978-3-030-97546-3_58

  • DOI: https://doi.org/10.1007/978-3-030-97546-3_58

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-97545-6

  • Online ISBN: 978-3-030-97546-3

  • eBook Packages: Computer Science, Computer Science (R0)
