
Better Induction Models for Classification of Forest Cover

  • Conference paper
Mobile, Ubiquitous, and Intelligent Computing

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 274)


Abstract

The forest cover type data set, derived from cartographic variables, is very large, comprising 581,012 instances. Decision tree-based data mining methods, which require relatively little computing resources, are therefore good candidates for building classification models for it. Random forests, ensembles of many randomized decision trees, are known to be an effective data mining tool, and a technique based on a grid search over random forest parameters was investigated to find a highly accurate classifier. Experiments showed that a classifier of high accuracy could be found for the forest cover type data set.
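The abstract does not spell out the tool chain or the parameters searched, so the following is a minimal, illustrative sketch of what a grid search over random forest parameters on the forest cover type data could look like, written in Python with scikit-learn. The parameter grid, value ranges, and evaluation split are assumptions for demonstration only, not the settings used in the paper.

```python
# Illustrative sketch only: grid search over random forest hyperparameters
# on the UCI forest cover type data. The grid and split are assumptions,
# not the configuration reported in the paper.
from sklearn.datasets import fetch_covtype
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Load the 581,012-instance cover type data set (54 attributes, 7 classes).
X, y = fetch_covtype(return_X_y=True)

# Hold out a test set; stratify to preserve the class distribution.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Candidate values for the ensemble size and the number of attributes
# examined at each split -- two parameters commonly tuned for random forests.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_features": ["sqrt", 8, 16],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0, n_jobs=-1),
    param_grid,
    cv=3,
    scoring="accuracy",
)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print("cross-validated accuracy:", search.best_score_)
print("test-set accuracy:", search.score(X_test, y_test))
```

On the full data set such an exhaustive search is computationally heavy; a stratified subsample can be used first to narrow the grid before refitting on all training instances.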



Author information

Corresponding author: Hyontai Sug.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sug, H. (2014). Better Induction Models for Classification of Forest Cover. In: Park, J., Adeli, H., Park, N., Woungang, I. (eds) Mobile, Ubiquitous, and Intelligent Computing. Lecture Notes in Electrical Engineering, vol 274. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40675-1_29


  • DOI: https://doi.org/10.1007/978-3-642-40675-1_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-40674-4

  • Online ISBN: 978-3-642-40675-1

  • eBook Packages: Engineering (R0)
