Abstract
Lazy Bayesian Rules modifies naive Bayesian classification to undo elements of the harmful attribute independence assumption. It has been shown to provide classification error comparable to boosting decision trees. This paper explores alternatives to the candidate elimination criterion employed within Lazy Bayesian Rules. Improvements over naive Bayes are consistent so long as the candidate elimination criterion ensures there is sufficient data for accurate probability estimation. However, the original candidate elimination criterion is demonstrated to provide better overall error reduction than the use of a minimum-data-subset-size criterion.
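The ideas the abstract summarizes can be sketched in code. The following is an illustrative sketch only, not the authors' implementation: a categorical naive Bayes classifier with Laplace smoothing (the baseline LBR modifies), plus a hypothetical minimum-subset-size criterion of the kind the paper compares against the original candidate elimination criterion. Function names and the `min_subset` threshold are assumptions.

```python
from collections import Counter, defaultdict

def train_naive_bayes(X, y):
    """Estimate class counts and per-attribute conditional counts
    from categorical training data X (tuples) with labels y."""
    class_counts = Counter(y)
    cond_counts = defaultdict(Counter)  # (attr_index, class) -> Counter of values
    for xi, yi in zip(X, y):
        for a, v in enumerate(xi):
            cond_counts[(a, yi)][v] += 1
    return class_counts, cond_counts

def predict(x, class_counts, cond_counts, n_values):
    """Classify x by maximum posterior under the attribute
    independence assumption, with Laplace smoothing.
    n_values[a] is the number of distinct values of attribute a."""
    n = sum(class_counts.values())
    best, best_p = None, -1.0
    for c, cc in class_counts.items():
        p = cc / n  # class prior
        for a, v in enumerate(x):
            # P(v | c), smoothed so unseen values get non-zero probability
            p *= (cond_counts[(a, c)][v] + 1) / (cc + n_values[a])
        if p > best_p:
            best, best_p = c, p
    return best

def eligible_conditions(X, x, min_subset=5):
    """Hypothetical minimum-data-subset-size criterion: keep only those
    attribute-value conditions of the test instance whose matching
    training subset is large enough for reliable probability estimation."""
    keep = []
    for a, v in enumerate(x):
        subset_size = sum(1 for xi in X if xi[a] == v)
        if subset_size >= min_subset:
            keep.append((a, v))
    return keep
```

In an LBR-style lazy learner, conditions surviving such a criterion would be candidates for the antecedent of a rule grown for each test instance, with a local naive Bayes classifier trained on the matching subset; the paper's finding is that a pure size threshold reduces error less than the original criterion does.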
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Webb, G.I. (2001). Candidate Elimination Criteria for Lazy Bayesian Rules. In: Stumptner, M., Corbett, D., Brooks, M. (eds) AI 2001: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 2256. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45656-2_47
DOI: https://doi.org/10.1007/3-540-45656-2_47
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42960-9
Online ISBN: 978-3-540-45656-8