DOI: 10.1145/1553374.1553414

GAODE and HAODE: two proposals based on AODE to deal with continuous variables

Published: 14 June 2009

Abstract

AODE (Aggregating One-Dependence Estimators) is considered one of the most interesting representatives of the Bayesian classifiers, on account of both its low error rate and its efficiency. Until now, all the attributes in a dataset have had to be nominal, or previously discretized, in order to build an AODE classifier. In this paper, we propose two different approaches for dealing directly with numeric attributes. The first, GAODE, uses conditional Gaussian networks to model datasets whose attributes are exclusively numeric. The second, HAODE, keeps the superparent of each model discrete and uses univariate Gaussians to estimate the probabilities of the numeric attributes and multinomial distributions for the categorical ones; it is therefore also able to model hybrid datasets. Both obtain competitive results compared to AODE, and the latter in particular is a very attractive alternative to AODE on numeric datasets.
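The HAODE idea described in the abstract can be illustrated with a small sketch. This is not the authors' implementation: every name below (`fit_haode`, `predict_haode`, the dictionary layout) is invented for illustration, and the smoothing a real implementation would use is deliberately omitted. Each discrete attribute acts in turn as the superparent; conditioned on (class, superparent value), numeric children are modelled with univariate Gaussians and discrete children with multinomials, and the per-superparent scores are aggregated as in AODE.

```python
# A minimal sketch of the HAODE idea, assuming a hybrid dataset where
# discrete attributes take hashable values and numeric ones are floats.
import math
from collections import defaultdict

def gaussian(x, mu, var):
    """Univariate Gaussian density, with a floor on the variance."""
    var = max(var, 1e-6)
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_haode(X, y, discrete, numeric):
    """Collect counts / sample lists for every (superparent value, class) pair."""
    model = {"classes": sorted(set(y)), "n": len(y),
             "discrete": discrete, "numeric": numeric,
             "joint": defaultdict(int),    # N(x_sp, c)
             "disc": defaultdict(int),     # N(x_d, x_sp, c)
             "num": defaultdict(list)}     # values of x_a given (x_sp, c)
    for row, c in zip(X, y):
        for sp in discrete:
            key = (sp, row[sp], c)
            model["joint"][key] += 1
            for d in discrete:
                if d != sp:
                    model["disc"][key + (d, row[d])] += 1
            for a in numeric:
                model["num"][key + (a,)].append(row[a])
    return model

def predict_haode(model, row):
    """Aggregate the one-dependence scores over all discrete superparents."""
    best, best_score = None, -1.0
    for c in model["classes"]:
        score = 0.0
        for sp in model["discrete"]:
            key = (sp, row[sp], c)
            n_spc = model["joint"][key]
            if n_spc == 0:
                continue
            p = n_spc / model["n"]                 # P(x_sp, c)
            for d in model["discrete"]:            # multinomial children
                if d != sp:
                    p *= model["disc"][key + (d, row[d])] / n_spc
            for a in model["numeric"]:             # Gaussian children
                vals = model["num"][key + (a,)]
                mu = sum(vals) / len(vals)
                var = sum((v - mu) ** 2 for v in vals) / len(vals)
                p *= gaussian(row[a], mu, var)
            score += p
        if score > best_score:
            best, best_score = c, score
    return best

# Toy hybrid dataset: attribute 0 is discrete, attribute 1 is numeric.
X = [[0, 1.0], [0, 1.2], [0, 0.8], [1, 5.0], [1, 5.2], [1, 4.8]]
y = ["a", "a", "a", "b", "b", "b"]
model = fit_haode(X, y, discrete=[0], numeric=[1])
print(predict_haode(model, [0, 1.1]))  # -> a
```

Because the superparent is always discrete, only the Gaussian means and variances depend on the numeric attributes, which is what lets this scheme handle hybrid datasets without discretization.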


Published In

ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning
June 2009
1331 pages
ISBN:9781605585161
DOI:10.1145/1553374

Sponsors

  • NSF
  • Microsoft Research
  • MITACS

Publisher

Association for Computing Machinery

New York, NY, United States

Qualifiers

  • Research-article

Funding Sources

  • FPU
  • Spanish Ministerio de Educación y Tecnología
  • Consejería de Educación y Ciencia (JCCM)

Conference

ICML '09

Acceptance Rates

Overall Acceptance Rate 140 of 548 submissions, 26%

