Candidate Elimination Criteria for Lazy Bayesian Rules

  • Conference paper
AI 2001: Advances in Artificial Intelligence (AI 2001)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2256)

Abstract

Lazy Bayesian Rules modifies naive Bayesian classification to undo elements of the harmful attribute independence assumption, and has been shown to deliver classification error comparable to that of boosted decision trees. This paper explores alternatives to the candidate elimination criterion employed within Lazy Bayesian Rules. Improvements over naive Bayes are consistent so long as the candidate elimination criterion ensures that sufficient data remain for accurate probability estimation. However, the original candidate elimination criterion is demonstrated to provide better overall error reduction than a minimum data subset size criterion.
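
For each test instance, Lazy Bayesian Rules lazily grows a rule whose antecedent is a conjunction of attribute-value tests drawn from the instance itself, and classifies with a naive Bayes model trained on just the training examples satisfying that antecedent; an elimination criterion discards unsuitable candidate tests. The Python sketch below is illustrative only: it implements the minimum data subset size criterion discussed in the abstract (a candidate test is eliminated if it would leave fewer than min_subset examples), simplifies candidate selection to the largest qualifying subset, and does not implement the original error-based criterion. The function names (lbr_predict, naive_bayes_predict) and the default threshold are invented for this example.

```python
import math
from collections import Counter

def naive_bayes_predict(rows, labels, x, attrs=None, alpha=1.0):
    """Laplace-smoothed naive Bayes over the categorical attributes
    listed in attrs (all attributes of x by default)."""
    attrs = list(range(len(x))) if attrs is None else attrs
    class_counts = Counter(labels)
    n = len(labels)
    best_class, best_lp = None, -math.inf
    for c, nc in class_counts.items():
        # smoothed log prior
        lp = math.log((nc + alpha) / (n + alpha * len(class_counts)))
        for j in attrs:
            n_vals = len({row[j] for row in rows})
            match = sum(1 for row, y in zip(rows, labels)
                        if y == c and row[j] == x[j])
            # smoothed log conditional P(x_j | c)
            lp += math.log((match + alpha) / (nc + alpha * n_vals))
        if lp > best_lp:
            best_class, best_lp = c, lp
    return best_class

def lbr_predict(rows, labels, x, min_subset=30):
    """Classify x by lazily growing a rule antecedent of
    attribute-value tests taken from x itself, then applying naive
    Bayes locally to the examples satisfying the antecedent.
    A candidate test is eliminated when the subset it would leave
    is smaller than min_subset (the minimum-subset-size criterion)."""
    subset = list(zip(rows, labels))
    candidates = list(range(len(x)))   # attributes not yet in the rule
    while candidates:
        best_j, best_sub = None, None
        for j in candidates:
            sub = [(r, y) for r, y in subset if r[j] == x[j]]
            if len(sub) < min_subset:  # eliminate: too little data left
                continue
            if best_sub is None or len(sub) > len(best_sub):
                best_j, best_sub = j, sub
        if best_j is None:             # every candidate was eliminated
            break
        subset = best_sub              # commit the test attr_j == x[j]
        candidates.remove(best_j)
    local_rows = [r for r, _ in subset]
    local_labels = [y for _, y in subset]
    # attributes in the antecedent are constant locally, so the local
    # naive Bayes model uses only the remaining candidate attributes
    return naive_bayes_predict(local_rows, local_labels, x,
                               attrs=candidates)
```

Note how the threshold embodies the abstract's claim: set min_subset high enough and no tests are ever added, so the sketch degenerates to plain naive Bayes; set it too low and the local probabilities are estimated from very few examples, which is exactly the failure mode an elimination criterion must guard against.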

Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Webb, G.I. (2001). Candidate Elimination Criteria for Lazy Bayesian Rules. In: Stumptner, M., Corbett, D., Brooks, M. (eds) AI 2001: Advances in Artificial Intelligence. AI 2001. Lecture Notes in Computer Science (LNAI), vol 2256. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45656-2_47

  • DOI: https://doi.org/10.1007/3-540-45656-2_47

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42960-9

  • Online ISBN: 978-3-540-45656-8

  • eBook Packages: Springer Book Archive
