Abstract
The veto effect, whereby a single contradicting expert outputs a zero probability estimate and thereby annuls the product of all the remaining experts' estimates, causes the product fusion strategy to perform suboptimally. This problem can be resolved by moderation. We derive the moderation formula for the k-NN classifier using a Bayesian prior, and examine the merits of moderation on real data sets.
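The veto effect and its remedy can be illustrated numerically. The sketch below assumes the moderated k-NN estimate takes the Laplace-smoothed form (k_j + 1)/(k + c), where k_j is the number of the k nearest neighbours from class j and c is the number of classes, i.e. the estimate obtained from a uniform Bayesian prior; the function names and the three-expert scenario are illustrative, not taken from the paper.

```python
import numpy as np

def knn_estimates(k_j, k):
    # Conventional k-NN posterior estimate: the fraction of the k
    # nearest neighbours that belong to each class.  A class seen in
    # none of the neighbours gets probability exactly zero.
    return np.asarray(k_j, dtype=float) / k

def moderated_estimates(k_j, k, n_classes):
    # Moderated estimate under a uniform Bayesian prior (Laplace
    # smoothing, assumed form): (k_j + 1) / (k + c).  Every class now
    # receives a non-zero probability, so no expert can veto.
    return (np.asarray(k_j, dtype=float) + 1) / (k + n_classes)

def product_fusion(expert_outputs):
    # Product rule: multiply the experts' per-class estimates
    # element-wise and renormalise to sum to one.
    p = np.prod(expert_outputs, axis=0)
    return p / p.sum()

# Three k-NN experts, two classes, k = 5 neighbours each.
# Experts B and C favour class 0, but expert A sees no class-0
# neighbours at all and vetoes it.
k, c = 5, 2
counts = [[0, 5], [4, 1], [4, 1]]

conventional = product_fusion([knn_estimates(kj, k) for kj in counts])
moderated = product_fusion([moderated_estimates(kj, k, c) for kj in counts])

print(conventional)  # class 0 annihilated: [0., 1.]
print(moderated)     # class 0 narrowly recovered by the majority
```

With conventional estimates the single zero forces the fused decision to class 1 despite the two-to-one majority for class 0; with moderated estimates the product rule decides for class 0.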
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Alkoot, F.M., Kittler, J. (2001). Improving Product by Moderating k-NN Classifiers. In: Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2001. Lecture Notes in Computer Science, vol 2096. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-48219-9_43
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42284-6
Online ISBN: 978-3-540-48219-2