
Beyond Boosting: Recursive ECOC Learning Machines

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3077)

Abstract

We present an extensive experimental study evaluating the behaviour of Recursive ECOC (RECOC) [1] learning machines based on Low Density Parity Check (LDPC) coding structures. We show that, owing to the iterative decoding algorithms behind LDPC codes, RECOC multiclass learning is achieved progressively. This learning behaviour confirms the existence of a new boosting dimension, the one provided by the coding space. We present a method for searching for potentially good RECOC codes among LDPC ones. Starting from a properly selected LDPC code, we assess the effect of boosting on both weak and strong binary learners. For nearly all domains, we find that boosting a strong learner such as a Decision Tree is as effective as boosting a weak one such as a Decision Stump. This surprising result substantiates the hypothesis that weakening strong classifiers by boosting has a decorrelation effect, which can be used to improve RECOC learning.
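For readers unfamiliar with the ECOC setting, the sketch below (Python, using NumPy and scikit-learn) illustrates the general recipe the paper builds on: a sparse binary code matrix assigns a codeword to each class, one boosted Decision Stump is trained per codeword bit, and a test point is decoded to the nearest codeword. This is only an illustrative sketch under assumptions of our own: the random sparse matrix, the 15-bit code length, the AdaBoost parameters, and the digits dataset are placeholders, and plain Hamming-distance decoding stands in for the iterative LDPC decoding that RECOC relies on; it is not the authors' RECOC algorithm.

# Minimal ECOC sketch (not the authors' RECOC algorithm): a random sparse
# binary code matrix stands in for an LDPC-derived code, each codeword bit
# is learned by a boosted decision stump, and decoding uses nearest-codeword
# Hamming distance instead of iterative LDPC decoding. All names and
# parameters here are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def random_sparse_code(n_classes, n_bits, density=0.3):
    """Random sparse {0,1} code matrix; rows are class codewords."""
    while True:
        M = (rng.random((n_classes, n_bits)) < density).astype(int)
        # Reject degenerate columns (all 0s or all 1s) and duplicate rows.
        if (M.sum(axis=0) > 0).all() and (M.sum(axis=0) < n_classes).all() \
                and len(np.unique(M, axis=0)) == n_classes:
            return M

def fit_ecoc(X, y, code):
    """Train one boosted Decision Stump per codeword bit."""
    learners = []
    for b in range(code.shape[1]):
        z = code[y, b]  # relabel each sample with bit b of its class codeword
        clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                 n_estimators=50)
        learners.append(clf.fit(X, z))
    return learners

def predict_ecoc(X, code, learners):
    """Decode to the class whose codeword is nearest in Hamming distance."""
    bits = np.column_stack([clf.predict(X) for clf in learners])
    dist = (bits[:, None, :] != code[None, :, :]).sum(axis=2)
    return dist.argmin(axis=1)

X, y = load_digits(return_X_y=True)  # 10-class toy problem
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
code = random_sparse_code(n_classes=10, n_bits=15)
learners = fit_ecoc(Xtr, ytr, code)
acc = (predict_ecoc(Xte, code, learners) == yte).mean()
print(f"ECOC test accuracy: {acc:.3f}")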


References

  1. Tapia, E., González, J.C., García Villalba, J., Villena, J.: Recursive Adaptive ECOC models. In: Brazdil, P.B., Jorge, A.M. (eds.) EPIA 2001. LNCS (LNAI), vol. 2258, p. 96. Springer, Heidelberg (2001)

  2. Dietterich, T., Bakiri, G.: Error-correcting output codes: A general method for improving multiclass inductive learning programs. In: Proceedings of the Ninth National Conference on Artificial Intelligence (AAAI 1991), pp. 572–577. AAAI Press, Anaheim (1991)

  3. Crammer, K., Singer, Y.: On the Learnability and Design of Output Codes for Multiclass Problems. In: Cesa-Bianchi, N., Goldman, S.A. (eds.) Proceedings COLT 2000, pp. 35–46. Morgan Kaufmann, Palo Alto (2000)

  4. Allwein, E., Schapire, R., Singer, Y.: Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers. Journal of Machine Learning Research 1, 113–141 (2000)

  5. Tapia, E., González, J.C., García, J.: Good Error Correcting Output Codes For Adaptive Multiclass Learning. In: Windeatt, T., Roli, F. (eds.) MCS 2003. LNCS, vol. 2709, Springer, Heidelberg (2003)

  6. Tanner, M.: A recursive approach to Low Complexity Error Correcting Codes. IEEE Trans. Inf. Theory 27, 533–547 (1981)

  7. Kschischang, F., Frey, B.: Iterative decoding of compound codes by probability propagation in graphical models. IEEE Journal on Sel. Areas in Communications 16(2), 219–230 (1998)

  8. Gallager, R.G.: Low Density Parity-Check Codes. M.I.T. Press, Cambridge (1963)

  9. MacKay, D.J.: Good Error Correcting Codes based on Very Sparse Matrices. IEEE Trans. Inf. Theory 45, 399–431 (1999)

  10. Guruswami, V., Sahai, A.: Multiclass Learning, Boosting, and Error-Correcting Codes. In: Proceedings COLT 1999, Santa Cruz, CA, USA, pp. 145–155 (1999)

  11. Masulli, F., Valentini, G.: Dependence among Codeword Bits Error in ECOC Learning: An Experimental Analysis. In: Kittler, J., Roli, F. (eds.) MCS 2001. LNCS, vol. 2096, p. 158. Springer, Heidelberg (2001)

  12. Schapire, R.E., Singer, Y.: Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning 37(3), 277–296 (1999)

  13. Tapia, E., González, J.C., García-Villalba, J.: A Generalized Class of Boosting Algorithms based on Recursive Error Correcting Codes. In: Kittler, J., Roli, F. (eds.) MCS 2001. LNCS, vol. 2096, p. 22. Springer, Heidelberg (2001)

  14. Witten, I., Frank, E.: Data Mining, Practical Machine Learning Tools and Techniques with JAVA Implementations. Morgan Kaufmann Publishers, San Francisco (2000)

  15. Fürnkranz, J.: Round Robin Classification. Journal of Machine Learning Research 2, 721–747 (2002)

  16. Rifkin, R.: Everything Old is New Again: A Fresh Look at Historical Approaches in Machine Learning. Ph.D. Thesis, Massachusetts Institute of Technology (2002)

Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tapia, E., González, J.C., Hütermann, A., García, J. (2004). Beyond Boosting: Recursive ECOC Learning Machines. In: Roli, F., Kittler, J., Windeatt, T. (eds) Multiple Classifier Systems. MCS 2004. Lecture Notes in Computer Science, vol 3077. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-25966-4_6

  • DOI: https://doi.org/10.1007/978-3-540-25966-4_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22144-9

  • Online ISBN: 978-3-540-25966-4

  • eBook Packages: Springer Book Archive
