Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3641)

Abstract

In this paper, a new variant of Bagging named DepenBag is proposed. The algorithm first obtains bootstrap samples. It then employs a causal discoverer to induce from each sample a dependency model expressed as a directed acyclic graph (DAG). Attributes that have no connection to the class attribute in any of the DAGs are removed. Finally, a component learner is trained on each of the resulting samples to constitute the ensemble. An empirical study shows that DepenBag is effective in building ensembles of nearest neighbor classifiers.
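The procedure above can be sketched in a few lines. This is an illustrative stand-in, not the authors' implementation: the paper employs an actual causal discoverer that learns a DAG per bootstrap sample, whereas the sketch below substitutes a simple |correlation with the class| test for "connected to the class in the DAG"; the function name `depenbag` and the `threshold` parameter are likewise assumptions for illustration.

```python
import numpy as np

def depenbag(X, y, n_estimators=10, threshold=0.2, seed=0):
    """Sketch of DepenBag's sampling and attribute screening.

    An attribute is kept if it is flagged as linked to the class in at
    least one bootstrap sample's dependency model; attributes unlinked
    in all models are removed from every sample.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    indices, linked = [], np.zeros(d, dtype=bool)
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)           # bootstrap sample
        for j in range(d):
            # Stand-in for the causal discoverer's DAG connectivity check.
            c = np.corrcoef(X[idx, j], y[idx])[0, 1]
            if abs(c) > threshold:
                linked[j] = True
        indices.append(idx)
    keep = np.flatnonzero(linked)                  # linked in >= 1 model
    # One reduced bootstrap sample per ensemble member; a component
    # learner (e.g. a nearest neighbor classifier) would be trained on each.
    reduced = [(X[idx][:, keep], y[idx]) for idx in indices]
    return reduced, keep
```

Each pair in `reduced` is then handed to a component learner, and the trained learners are combined by voting, as in standard Bagging.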




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jiang, Y., Ling, J.J., Li, G., Dai, H., Zhou, Z.H. (2005). Dependency Bagging. In: Ślęzak, D., Wang, G., Szczuka, M., Düntsch, I., Yao, Y. (eds) Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing. RSFDGrC 2005. Lecture Notes in Computer Science, vol. 3641. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11548669_51

  • DOI: https://doi.org/10.1007/11548669_51

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28653-0

  • Online ISBN: 978-3-540-31825-5

  • eBook Packages: Computer Science (R0)
