
Improving Product by Moderating k-NN Classifiers

  • Conference paper
Multiple Classifier Systems (MCS 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2096)


Abstract

The veto effect, which arises when contradicting experts output zero probability estimates, causes fusion strategies to perform suboptimally. This can be resolved by moderation. A moderation formula is derived for the k-NN classifier using a Bayesian prior, and the merits of moderation are examined on real data sets.
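To illustrate the veto effect and how moderation removes it, the following minimal Python sketch compares unmoderated and moderated k-NN posterior estimates under the product rule. The moderated form shown, (k_c + 1)/(k + C), is the standard Laplace-smoothed estimate corresponding to a uniform Bayesian prior; it is used here as an assumed stand-in for the paper's derivation, and all function names are illustrative.

```python
import numpy as np

def knn_posterior(neighbour_labels, num_classes, moderated=False):
    # Unmoderated k-NN estimate: P(c|x) = k_c / k, which is exactly zero for any
    # class absent from the neighbourhood and can therefore veto the product rule.
    # Moderated estimate (assumed Laplace / uniform-Bayesian-prior form):
    #   P(c|x) = (k_c + 1) / (k + num_classes), which is never zero.
    k = len(neighbour_labels)
    counts = np.bincount(neighbour_labels, minlength=num_classes).astype(float)
    if moderated:
        return (counts + 1.0) / (k + num_classes)
    return counts / k

def product_fusion(expert_posteriors):
    # Product rule: multiply the experts' posteriors class-wise and renormalise.
    fused = np.prod(expert_posteriors, axis=0)
    return fused / fused.sum()

# Two k-NN experts, three classes, k = 5 neighbours each.
expert_a = [0, 0, 0, 1, 1]   # class 2 absent -> unmoderated P(2|x) = 0
expert_b = [2, 2, 2, 2, 0]   # class 1 absent -> unmoderated P(1|x) = 0

raw = product_fusion(np.array([knn_posterior(e, 3) for e in (expert_a, expert_b)]))
mod = product_fusion(np.array([knn_posterior(e, 3, moderated=True) for e in (expert_a, expert_b)]))
print("unmoderated product:", raw)   # classes 1 and 2 are vetoed outright
print("moderated product:  ", mod)   # every class keeps non-zero support
```

In the unmoderated case a single expert that saw no neighbours of a class drives that class's fused probability to zero regardless of the other expert's evidence; moderation pulls sparse estimates toward the uniform prior, so no single expert can annihilate a class on its own.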





Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Alkoot, F.M., Kittler, J. (2001). Improving Product by Moderating k-NN Classifiers. In: Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2001. Lecture Notes in Computer Science, vol 2096. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48219-9_43

  • DOI: https://doi.org/10.1007/3-540-48219-9_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42284-6

  • Online ISBN: 978-3-540-48219-2

  • eBook Packages: Springer Book Archive
