An Advanced Random Forest Algorithm Targeting the Big Data with Redundant Features

  • Conference paper
Algorithms and Architectures for Parallel Processing (ICA3PP 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10393)

Abstract

As we enter the age of big data, methods for analyzing big data are attracting growing attention and novel approaches keep emerging; random forest is one of the most prominent among them. Random forest fuses multiple sub decision trees for classification and regression and offers high accuracy and good generalization. Its performance, however, degrades on data sets with heavy noise and redundant features. This is mainly caused by inaccurate sub decision trees: fusing all of them directly cannot suppress their negative effect. We therefore propose an advanced random forest that assigns lower selection probability to these negative sub decision trees, so that they are less likely to be chosen during fusion, which improves predictive performance. The dropout and roulette-wheel selection used in this process preserve generalization while maintaining high accuracy. We also sample the original data set by K-fold division, which increases the diversity among sub decision trees and makes the prediction more credible. Finally, the proposed method is validated on several data sets. Experimental results show that, compared with the traditional random forest, our method achieves higher classification accuracy on data sets with noise and on data sets with more redundant features.
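Since only the abstract is available here, the sketch below is an illustrative reconstruction rather than the authors' implementation: it trains one sub decision tree per K-fold split, weights each tree by its held-out accuracy, and fuses predictions by drawing a random subset of trees with roulette-wheel (accuracy-proportional) probabilities so that weak trees vote less often. The function names, the K-fold training scheme, the dropout rate of one half, and the scikit-learn components are all assumptions.

```python
# Illustrative sketch only; the paper's exact training and fusion rules are not
# reproduced here. X and y are assumed to be NumPy arrays with integer labels.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier


def fit_weighted_forest(X, y, n_splits=10, seed=0):
    """Train one sub decision tree per fold and weight it by held-out accuracy."""
    trees, weights = [], []
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for i, (train_idx, val_idx) in enumerate(kf.split(X)):
        tree = DecisionTreeClassifier(max_features="sqrt", random_state=seed + i)
        tree.fit(X[train_idx], y[train_idx])
        trees.append(tree)
        weights.append(tree.score(X[val_idx], y[val_idx]))  # held-out accuracy
    weights = np.asarray(weights)
    return trees, weights / weights.sum()  # roulette-wheel selection probabilities


def predict_roulette(trees, probs, X, n_draws=None, seed=0):
    """Fuse predictions from a randomly drawn subset of trees (dropout + roulette)."""
    rng = np.random.default_rng(seed)
    n_draws = n_draws or max(1, len(trees) // 2)  # assumed dropout rate of 1/2
    chosen = rng.choice(len(trees), size=n_draws, replace=False, p=probs)
    votes = np.stack([trees[i].predict(X) for i in chosen]).astype(int)
    # Majority vote over the sampled sub decision trees.
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```

In this sketch, sampling without replacement plays the role of dropout (only part of the forest votes on any given query), while the accuracy-proportional probabilities provide the roulette-wheel bias toward stronger sub decision trees.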

Acknowledgement

This work has been supported by the National Natural Science Foundation of China (61372068, 61672410), the National Key Research and Development Program of China (grant 2016YFB0800704), and is also supported by the ISN State Key Laboratory.

Author information

Corresponding author

Correspondence to Bin Song.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Zhang, Y., Song, B., Zhang, Y., Chen, S. (2017). An Advanced Random Forest Algorithm Targeting the Big Data with Redundant Features. In: Ibrahim, S., Choo, KK., Yan, Z., Pedrycz, W. (eds.) Algorithms and Architectures for Parallel Processing. ICA3PP 2017. Lecture Notes in Computer Science, vol 10393. Springer, Cham. https://doi.org/10.1007/978-3-319-65482-9_49

  • DOI: https://doi.org/10.1007/978-3-319-65482-9_49

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-65481-2

  • Online ISBN: 978-3-319-65482-9

  • eBook Packages: Computer Science, Computer Science (R0)
