
DRFLogitBoost: A Double Randomized Decision Forest Incorporated with LogitBoosted Decision Stumps

  • Conference paper
Intelligent Information and Database Systems (ACIIDS 2012)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 7196))


Abstract

In this paper, a hybrid decision forest is constructed by double randomization of the original training set. In this decision forest, each individual base decision tree classifier is paired with an additional classifier model, the LogitBoosted decision stump. In the first randomization, the resamples used to train the decision trees are extracted; in the second randomization, a second set of resamples is generated from the out-of-bag samples of the first set. The boosted decision stumps are constructed on these second resamples. The extra resamples, together with the resamples on which the base tree classifiers are trained, approximate the original training set. In this way the full training set is utilized to construct a hybrid decision forest with a larger feature space. We have applied this hybrid decision forest to two real-world applications: (a) classifying credit scores, and (b) short-term extreme rainfall forecasting. The performance of the hybrid decision forest on these two problems is compared with some well-known machine learning methods. Overall results suggest that the new hybrid decision forest is capable of yielding commendable predictive performance.
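The double-randomization scheme in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn estimators, and it uses `GradientBoostingClassifier` with depth-1 trees (log-loss gradient boosting on decision stumps, a close relative of LogitBoost) as a stand-in for LogitBoosted stumps. Function names and the majority-vote combiner are illustrative choices, and binary 0/1 labels are assumed.

```python
# Sketch of a double-randomized forest with boosted stumps (illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier

def fit_drf_logitboost(X, y, n_trees=10, n_stumps=10, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    ensemble = []
    for _ in range(n_trees):
        # First randomization: bootstrap resample to train the base tree.
        idx = rng.integers(0, n, size=n)
        tree = DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx])
        member = [tree]
        # Out-of-bag samples: training points not drawn in the first resample.
        oob = np.setdiff1d(np.arange(n), idx)
        if len(oob) > 0:
            # Second randomization: resample from the out-of-bag set and fit
            # boosted decision stumps (max_depth=1) on that second resample.
            idx2 = rng.choice(oob, size=len(oob), replace=True)
            stumps = GradientBoostingClassifier(
                max_depth=1, n_estimators=n_stumps, random_state=0
            ).fit(X[idx2], y[idx2])
            member.append(stumps)
        ensemble.append(member)
    return ensemble

def predict(ensemble, X):
    # Majority vote over every classifier (trees and stump models) in the forest.
    votes = np.stack([clf.predict(X) for member in ensemble for clf in member])
    return np.round(votes.mean(axis=0)).astype(int)
```

Together the bootstrap resample and its out-of-bag resample cover (approximately) the whole training set, which is the sense in which the hybrid forest "utilizes the full training set" while each component model still sees a randomized subset.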




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Faisal, Z.M., Monira, S.S., Hirose, H. (2012). DRFLogitBoost: A Double Randomized Decision Forest Incorporated with LogitBoosted Decision Stumps. In: Pan, J.S., Chen, S.M., Nguyen, N.T. (eds) Intelligent Information and Database Systems. ACIIDS 2012. Lecture Notes in Computer Science, vol 7196. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28487-8_30


  • DOI: https://doi.org/10.1007/978-3-642-28487-8_30

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-28486-1

  • Online ISBN: 978-3-642-28487-8

  • eBook Packages: Computer Science (R0)
