Abstract
Local methods have significant advantages when the probability measure defined on the space of symbolic objects for each class is very complex but can still be described by a collection of less complex local approximations. We propose a technique of local bagging of decision stumps. We compared it with other well-known combining methods that use the same base learner on standard benchmark datasets, and the accuracy of the proposed technique was greater in most cases.
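The abstract only names the technique, so the following is a minimal, hypothetical sketch of how local bagging of decision stumps can be realised: for each query instance, a bagged ensemble of decision stumps (one-level decision trees) is trained on that instance's k nearest neighbours rather than on the whole training set. The function local_bagging_predict, the neighbourhood size k, and the ensemble size are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of "local bagging of decision stumps": for each
# query point, a bagged ensemble of decision stumps (depth-1 trees) is
# trained only on the point's k nearest neighbours. Names, k, and the
# ensemble size are illustrative assumptions, not the paper's code.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier


def local_bagging_predict(X_train, y_train, X_query, k=50, n_estimators=10):
    """Classify each query instance with a bagged stump ensemble
    fit on its k nearest training neighbours."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    predictions = []
    for x in X_query:
        _, idx = nn.kneighbors(x.reshape(1, -1))  # local neighbourhood
        ensemble = BaggingClassifier(
            estimator=DecisionTreeClassifier(max_depth=1),  # a decision stump
            n_estimators=n_estimators,
            random_state=0,
        )  # note: scikit-learn < 1.2 names this parameter base_estimator
        ensemble.fit(X_train[idx.ravel()], y_train[idx.ravel()])
        predictions.append(ensemble.predict(x.reshape(1, -1))[0])
    return np.array(predictions)


# Small usage example on a standard benchmark dataset.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
print("accuracy:", np.mean(local_bagging_predict(X_tr, y_tr, X_te) == y_te))
```

Because a fresh ensemble is fit for every query, the computational cost shifts from training time to prediction time, which is characteristic of lazy, locally weighted learners.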
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Kotsiantis, S.B., Tsekouras, G.E., Pintelas, P.E. (2005). Local Bagging of Decision Stumps. In: Ali, M., Esposito, F. (eds) Innovations in Applied Artificial Intelligence. IEA/AIE 2005. Lecture Notes in Computer Science, vol 3533. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11504894_57
DOI: https://doi.org/10.1007/11504894_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26551-1
Online ISBN: 978-3-540-31893-4
eBook Packages: Computer Science (R0)