
Localized Generalization Error of Gaussian-based Classifiers and Visualization of Decision Boundaries

Published in: Soft Computing

Abstract

In a pattern classification problem, one trains a classifier on a training dataset to recognize future unseen samples. In practice, one should not expect a trained classifier to correctly recognize samples dissimilar to the training dataset. Therefore, characterizing the generalization capability of a classifier over all such unseen samples may not help improve the classifier's accuracy. The localized generalization error model was proposed to bound from above the generalization mean square error for only those unseen samples similar to the training dataset. This error model is derived from the stochastic sensitivity measure (ST-SM) of the classifier. In this paper, we present the ST-SMs for two Gaussian-based classifiers: radial basis function neural networks and support vector machines. Finally, we compare, in the input space, the decision boundary visualization obtained from the training samples yielding the largest sensitivity measures with the one obtained from the support vectors.
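The abstract's central quantity, the stochastic sensitivity measure, is the expected squared change of the classifier output under small random perturbations of an input. As a rough illustration only, the sketch below estimates such a measure by Monte Carlo for a Gaussian RBF network; the perturbation neighbourhood width `q`, the network parameters, and the function names are all illustrative assumptions, not the paper's closed-form derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_output(x, centers, widths, weights):
    """Output of a Gaussian RBF network for a single input vector x."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return float(weights @ np.exp(-d2 / (2.0 * widths ** 2)))

def stochastic_sensitivity(x, centers, widths, weights, q=0.1, n_mc=1000):
    """Monte Carlo estimate of E[(f(x + dx) - f(x))^2], with dx drawn
    uniformly from a hypercube neighbourhood of half-width q around x.
    This approximates the role of the ST-SM; the paper derives it
    analytically for Gaussian-based classifiers."""
    y0 = rbf_output(x, centers, widths, weights)
    dx = rng.uniform(-q, q, size=(n_mc, x.size))
    ys = np.array([rbf_output(x + d, centers, widths, weights) for d in dx])
    return float(np.mean((ys - y0) ** 2))
```

Under this view, ranking training samples by their sensitivity estimate picks out inputs where the classifier output changes most rapidly, which is why the paper can compare a decision-boundary visualization built from the highest-sensitivity training samples against one built from support vectors.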




Author information


Corresponding author

Correspondence to Wing W. Y. Ng.


About this article

Cite this article

Ng, W.W.Y., Yeung, D.S., Wang, D. et al. Localized Generalization Error of Gaussian-based Classifiers and Visualization of Decision Boundaries. Soft Comput 11, 375–381 (2007). https://doi.org/10.1007/s00500-006-0092-4

