Hardware/Algorithm Co-optimization for Fully-Parallelized Compact Decision Tree Ensembles on FPGAs

  • Conference paper
  • First Online:
Applied Reconfigurable Computing. Architectures, Tools, and Applications (ARC 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12083)

Abstract

Decision tree ensembles, such as random forests, combine multiple weak learners called decision trees and are well-known classification and regression methods with high accuracy and robustness, especially for categorical data. We propose an architecture/algorithm co-design method for implementing fully parallelized, fast decision tree ensembles on FPGAs. The method first produces a compact, almost equivalent representation of the original decision trees by threshold compaction: for each input feature, comparisons with similar thresholds are merged into fewer distinct thresholds, reducing the number of comparators. The trees with merged thresholds are then fully unrolled into hard-wired logic for maximum throughput. We developed a prototype hardware synthesis compiler that generates a Verilog hardware description language (HDL) description from the compacted representation. Experiments demonstrate that the proposed method reduces the size of the generated hardware without accuracy degradation.
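
The abstract outlines two concrete stages: per-feature threshold compaction, and full unrolling of the compacted trees into hard-wired logic. The sketch below illustrates how such a flow could look; it is not the authors' implementation. Everything in it is an assumption made for illustration: the Node structure, the names merge_thresholds and emit_verilog, the tolerance-based merging rule with parameter eps, and the fixed-point port width.

    # Hypothetical sketch of the two stages named in the abstract.
    # All names and the eps-tolerance merging rule are illustrative
    # assumptions, not the paper's algorithm.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        feature: Optional[int] = None   # feature index tested at this node
        threshold: float = 0.0          # go left if x[feature] <= threshold
        left: Optional["Node"] = None
        right: Optional["Node"] = None
        label: Optional[int] = None     # class label; set only on leaves

    def merge_thresholds(trees, eps=0.01):
        # Stage 1 (threshold compaction): per feature, cluster thresholds
        # lying within eps of each other and snap each to the cluster mean,
        # so near-identical comparators collapse into one.
        per_feature = {}
        def collect(node):
            if node.label is None:
                per_feature.setdefault(node.feature, []).append(node.threshold)
                collect(node.left)
                collect(node.right)
        for tree in trees:
            collect(tree)
        rep = {}  # (feature, original threshold) -> merged threshold
        for f, ts in per_feature.items():
            group = []
            for t in sorted(ts):
                if group and t - group[0] > eps:
                    mean = sum(group) / len(group)
                    rep.update({(f, g): mean for g in group})
                    group = []
                group.append(t)
            mean = sum(group) / len(group)
            rep.update({(f, g): mean for g in group})
        def rewrite(node):
            if node.label is None:
                node.threshold = rep[(node.feature, node.threshold)]
                rewrite(node.left)
                rewrite(node.right)
        for tree in trees:
            rewrite(tree)
        return trees

    def emit_verilog(tree, n_features, width=16, name="tree0"):
        # Stage 2 (hard-wired logic): unroll one tree into a purely
        # combinational Verilog module; every comparison becomes a
        # comparator and every internal node a 2:1 multiplexer.
        def lit(value):
            v = int(round(value))       # assume pre-scaled fixed-point data
            return f"-{width}'sd{-v}" if v < 0 else f"{width}'sd{v}"
        def expr(node):
            if node.label is not None:
                return f"8'd{node.label}"
            return (f"(x{node.feature} <= {lit(node.threshold)}) ? "
                    f"{expr(node.left)} : {expr(node.right)}")
        ports = ",\n".join(f"  input  signed [{width-1}:0] x{i}"
                           for i in range(n_features))
        return (f"module {name} (\n{ports},\n  output [7:0] label\n);\n"
                f"  assign label = {expr(tree)};\nendmodule\n")

    # Tiny usage example: two trees with nearly identical thresholds on
    # feature 0; after compaction both use the same comparator.
    def leaf(c):
        return Node(label=c)

    t0 = Node(feature=0, threshold=35.0, left=leaf(0), right=leaf(1))
    t1 = Node(feature=0, threshold=34.0, left=leaf(0),
              right=Node(feature=1, threshold=70.0,
                         left=leaf(1), right=leaf(0)))
    merge_thresholds([t0, t1], eps=2.0)  # both x0 thresholds become 34.5
    print(emit_verilog(t0, n_features=2))

Because the unrolled expression is purely combinational, every comparison is evaluated in parallel, and merging thresholds lets synthesis share comparators across trees, which is consistent with the hardware-size reduction the abstract reports.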

Author information

Corresponding author

Correspondence to Taiga Ikeda.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Ikeda, T., Sakurada, K., Nakamura, A., Motomura, M., Takamaeda-Yamazaki, S. (2020). Hardware/Algorithm Co-optimization for Fully-Parallelized Compact Decision Tree Ensembles on FPGAs. In: Rincón, F., Barba, J., So, H., Diniz, P., Caba, J. (eds) Applied Reconfigurable Computing. Architectures, Tools, and Applications. ARC 2020. Lecture Notes in Computer Science, vol. 12083. Springer, Cham. https://doi.org/10.1007/978-3-030-44534-8_26

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-44534-8_26

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-44533-1

  • Online ISBN: 978-3-030-44534-8

  • eBook Packages: Computer Science, Computer Science (R0)
