Abstract
We present an ensemble of classifiers for predicting a quality characteristic of an important process in the pulp and paper industry: the estimation of tissue softness. This classification problem is a difficult one: on our data set, the accuracy of all the well-known base classifiers stays below 68%. In contrast, a bagging ensemble of random trees raises the accuracy to 75%.
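For concreteness, the sketch below shows how a bagged random-tree ensemble of the kind named in the abstract might be assembled. Everything in it is an assumption rather than the authors' actual setup: scikit-learn's ExtraTreeClassifier stands in for the random-tree base learner, and synthetic data from make_classification stands in for the tissue-softness measurements; only the bagging construction itself (bootstrap resampling plus majority voting over trees) mirrors the method the abstract describes.

```python
# Hypothetical sketch of a bagging random-trees ensemble (not the authors' code).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import ExtraTreeClassifier

# Placeholder data: in the paper the features would be process measurements
# and the target a discretized softness grade; neither is available here.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Bagging: each randomized tree is fit on a bootstrap sample of the training
# set, and the ensemble predicts by majority vote over all trees.
ensemble = BaggingClassifier(
    ExtraTreeClassifier(),  # randomized decision tree as the base learner
    n_estimators=25,        # assumed ensemble size, chosen for illustration
    random_state=0,
)

# 10-fold cross-validated accuracy of the ensemble.
scores = cross_val_score(ensemble, X, y, cv=10)
print(f"mean accuracy: {scores.mean():.3f}")
```

Bagging helps here because individual random trees are high-variance learners: averaging many trees grown on different bootstrap samples reduces that variance, which is consistent with the accuracy gain the abstract reports over single classifiers.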
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Kotsiantis, S.B., Tsekouras, G.E., Pintelas, P.E. (2005). Bagging Random Trees for Estimation of Tissue Softness. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science, vol. 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_67
DOI: https://doi.org/10.1007/11510888_67
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26923-6
Online ISBN: 978-3-540-31891-0
eBook Packages: Computer Science, Computer Science (R0)