Abstract
It is widely accepted that no single Machine Learning System (MLS) achieves the smallest classification error on every data set. Different algorithms suit different problems, so it is natural to combine them in some way to improve overall accuracy. In this paper, we propose a method to construct a new MLS from given ones. It is based on selecting the system that will perform better on a particular data set. We study several ways of selecting the systems and carry out experiments with well-known MLSs on the Holte data sets.
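The core idea of selecting, per data set, the system expected to perform best can be illustrated with a minimal sketch. The paper's actual selection criteria are not reproduced here; the learners below (a majority-class baseline and a 1-nearest-neighbour rule) and the choice of k-fold cross-validated error as the selection score are illustrative assumptions only.

```python
# Hedged sketch of per-data-set system selection: train each candidate
# learner, estimate its error by k-fold cross-validation, and pick the
# system with the lowest estimated error. All names are illustrative.
import random

def majority_baseline(train):
    """Stand-in for a very simple rule learner: predict the majority class."""
    labels = [y for _, y in train]
    majority = max(set(labels), key=labels.count)
    return lambda x: majority

def nearest_neighbour(train):
    """Stand-in for an instance-based learner: 1-NN with squared distance."""
    def predict(x):
        _, y = min(train, key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], x)))
        return y
    return predict

def cv_error(learner, data, k=5, seed=0):
    """Estimate a learner's error rate by k-fold cross-validation."""
    data = list(data)
    random.Random(seed).shuffle(data)
    folds = [data[i::k] for i in range(k)]
    errors, total = 0, 0
    for i in range(k):
        test = folds[i]
        train = [ex for j, f in enumerate(folds) if j != i for ex in f]
        clf = learner(train)
        errors += sum(clf(x) != y for x, y in test)
        total += len(test)
    return errors / total

def select_system(candidates, data):
    """Return the candidate learner with the lowest cross-validated error."""
    return min(candidates, key=lambda learner: cv_error(learner, data))
```

On a data set where classes are well separated, `select_system` would prefer the instance-based learner over the majority baseline; on a noisier data set the ranking could reverse, which is exactly why the selection is made per data set.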
The research reported in this paper has been supported in part under MCyT and FEDER grant TIC2001-3579.
References
Cohen, P.R.: Empirical Methods for Artificial Intelligence. MIT Press. (1995)
Everitt, B.S.: The analysis of contingency tables. Chapman and Hall, London. (1977)
Snedecor, G.W., Cochran, W.G.: Statistical Methods. Iowa State University Press, Ames, IA, 8th edition. (1989)
Dietterich, T.G.: Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation. (1998) 10(7):1895–1923
Brazdil, P.B., Soares, C.: A comparison of ranking methods for classification algorithm selection. In Proceedings of the 11th European Conference on Machine Learning (ECML-2000), Barcelona, Spain. Springer-Verlag. (2000) 63–74
Quinlan, J.R.: Combining instance-based and model-based learning. In Machine Learning: Proceedings of the Tenth International Conference, Amherst, Massachusetts. Morgan Kaufmann. (1993) 236–243
Quevedo, J.R., Bahamonde, A.: Aprendizaje de funciones usando inducción sobre clasificaciones discretas [Learning functions using induction over discrete classifications]. Proceedings of CAEPIA'99-TTIA'99-VIII Conferencia de la Asociación Española para la Inteligencia Artificial — III Jornadas de Transferencia Tecnológica de Inteligencia Artificial, Murcia, Spain. (1999) 64–71
Fürnkranz, J.: Round Robin Classification. Journal of Machine Learning Research. (2002) 2:721–747
Kohavi, R., John, G., Long, R., Manley, D., Pfleger, K.: MLC++: A machine learning library in C++. IEEE Computer Society Press. In Proc. of the 6th International Conference on Tools with Artificial Intelligence. (1994) 740–743
Holte, R.C.: Very simple classification rules perform well on most commonly used datasets. Machine Learning. (1993) 11:63–91
Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann. (1993)
Domingos, P.: Unifying Instance-Based and Rule-Based Induction. Machine Learning. (1996) 24:141–168
Cohen, W.W.: Fast Effective Rule Induction. Proceedings of the 12th International Conference on Machine Learning (ML95), Morgan Kaufmann, San Francisco. (1995) 115–123
Murthy, S.K., Kasif, S., Salzberg, S.: A system for induction of oblique decision trees. Journal of Artificial Intelligence Research. (1994) 2:1–33
Aha, D.W., Kibler, D., Albert, M.K.: Instance based learning algorithms. Machine Learning, Vol. 6. (1991) 37–66
Wilson, D., Martinez, T.: Improved heterogeneous distance functions. Journal of Artificial Intelligence Research. (1997) 6:1–34
Kohavi, R.: Wrappers for performance enhancement and oblivious decision graphs. Ph.D. thesis, Stanford University. (1995)
© 2003 Springer-Verlag Berlin Heidelberg
Quevedo, J.R., Combarro, E.F., Bahamonde, A. (2003). Choosing among algorithms to improve accuracy. In: Mira, J., Álvarez, J.R. (eds) Computational Methods in Neural Modeling. IWANN 2003. Lecture Notes in Computer Science, vol 2686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44868-3_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40210-7
Online ISBN: 978-3-540-44868-6