Abstract
In Machine Learning systems that learn rules from examples, it is common for some of the examples to be uninformative, irrelevant, or noisy. Such examples lead these systems to induce inadequate rules. In this paper we present an algorithm that filters noisy examples with continuous labels, with a computational cost of O(N·log N + N·A²) for N examples and A attributes. We also show experimentally that it outperforms the filtering algorithms embedded in state-of-the-art Machine Learning systems.
The research reported in this paper has been supported in part by MCyT and FEDER under grant TIC2001-3579.
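The paper's filtering algorithm is not reproduced on this page. Purely as an illustration of the problem the abstract describes, below is a minimal sketch in Python of one common family of such filters: a k-nearest-neighbour outlier test that discards examples whose continuous label disagrees strongly with the labels of their nearest neighbours. The function name and the parameters k and threshold are hypothetical, and this brute-force version costs O(N²·A) rather than the paper's O(N·log N + N·A²).

import numpy as np

def knn_noise_filter(X, y, k=5, threshold=2.0):
    """Return a boolean mask keeping examples whose continuous label lies
    within `threshold` standard deviations of the labels of their k nearest
    neighbours (Euclidean distance over the attributes).
    X: (N, A) array of attribute values; y: (N,) array of labels."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)
    keep = np.ones(n, dtype=bool)
    k = min(k, n - 1)  # cannot use more neighbours than other examples
    for i in range(n):
        d = np.sum((X - X[i]) ** 2, axis=1)  # squared distances to all examples
        d[i] = np.inf                        # exclude the example itself
        nn = np.argpartition(d, k - 1)[:k]   # indices of the k nearest neighbours
        mu, sigma = y[nn].mean(), y[nn].std()
        # Flag the example as noisy if its label is a clear outlier among
        # its neighbours; sigma == 0 means the neighbours agree exactly.
        if (sigma > 0 and abs(y[i] - mu) > threshold * sigma) or \
           (sigma == 0 and y[i] != mu):
            keep[i] = False
    return keep

# Hypothetical usage: retain only the examples the filter accepts.
# mask = knn_noise_filter(X, y, k=5, threshold=2.0)
# X_clean, y_clean = X[mask], y[mask]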
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Ramón Quevedo Pérez, J., Dolores García, M., Montañés, E. (2002). Filtering Noisy Continuous Labeled Examples. In: Garijo, F.J., Riquelme, J.C., Toro, M. (eds) Advances in Artificial Intelligence — IBERAMIA 2002. Lecture Notes in Computer Science, vol 2527. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36131-6_6
Print ISBN: 978-3-540-00131-7
Online ISBN: 978-3-540-36131-2