Bagging Random Trees for Estimation of Tissue Softness

  • Conference paper
Machine Learning and Data Mining in Pattern Recognition (MLDM 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3587)

Abstract

We present an ensemble of classifiers for predicting a quality characteristic of an important process in the pulp and paper industry: tissue softness. This classification problem is difficult: on our data set, the accuracy of all the well-known classifiers is below 68%. In contrast, the bagging random trees ensemble raises the accuracy to 75%.
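
For readers who want to experiment with the approach named above, the sketch below shows one way to assemble a bagging random trees ensemble. It is a minimal illustration only, assuming a recent scikit-learn (1.2 or later); the synthetic make_classification data stands in for the paper's tissue-softness measurements, ExtraTreeClassifier stands in for the randomized tree learner, and the ensemble size and other settings are illustrative rather than taken from the paper.

    # Minimal sketch: bagging over randomized decision trees.
    # Assumptions: scikit-learn >= 1.2; toy data in place of the paper's data set.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import ExtraTreeClassifier

    # Hypothetical stand-in for the tissue-softness data, which is not public.
    X, y = make_classification(n_samples=300, n_features=10, random_state=0)

    # Bagging: each tree is fit on a bootstrap resample of the training set,
    # and the trees' predictions are combined by majority vote. The randomized
    # splits of ExtraTreeClassifier supply the diversity the vote relies on.
    ensemble = BaggingClassifier(
        estimator=ExtraTreeClassifier(),  # base learner: tree with random splits
        n_estimators=25,                  # number of bagged trees (illustrative)
        bootstrap=True,                   # sample instances with replacement
        random_state=0,
    )

    # Accuracy estimated by 10-fold cross-validation.
    scores = cross_val_score(ensemble, X, y, cv=10)
    print(f"mean accuracy: {scores.mean():.3f}")

The key design point is that bagging reduces the variance of an unstable base learner; randomized trees are deliberately unstable, which is why the combination tends to outperform any single tree.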

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kotsiantis, S.B., Tsekouras, G.E., Pintelas, P.E. (2005). Bagging Random Trees for Estimation of Tissue Softness. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science (LNAI), vol 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_67

  • DOI: https://doi.org/10.1007/11510888_67

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26923-6

  • Online ISBN: 978-3-540-31891-0
