
A Prediction Interval Estimation Method for KMSE

  • Conference paper
Advances in Natural Computation (ICNC 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3610)


Abstract

The kernel minimum squared error estimation (KMSE) model can be viewed as a general framework that includes kernel Fisher discriminant analysis (KFDA), the least squares support vector machine (LS-SVM), and kernel ridge regression (KRR) as particular cases. For continuous real-valued output, this paper shows the equivalence of KMSE and LS-SVM. We apply standard methods for computing prediction intervals in nonlinear regression to the KMSE model. The simulation results show that LS-SVM performs better in terms of the prediction intervals and mean squared error (MSE). An experiment on a real data set indicates that KMSE compares favorably with other methods.
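To illustrate the kind of interval construction the abstract describes, the sketch below fits kernel ridge regression (one of the special cases of KMSE named above) and forms approximate prediction intervals by treating the fitted model as a linear smoother, in the spirit of standard nonlinear-regression interval methods. This is a minimal sketch under stated assumptions, not the paper's exact procedure: the function names (`rbf_kernel`, `krr_prediction_interval`), the RBF kernel choice, and the normal-quantile approximation `z = 1.96` are all illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sets A (n, d) and B (m, d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_prediction_interval(X, y, X_new, lam=1e-2, gamma=1.0, z=1.96):
    """Kernel ridge regression with approximate prediction intervals.

    KRR is linear in y: y_hat = S y with smoother matrix S = K (K + lam I)^{-1},
    so the variance of a prediction at a new point is available in closed form.
    """
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    Ainv = np.linalg.solve(K + lam * np.eye(n), np.eye(n))  # (K + lam I)^{-1}
    alpha = Ainv @ y                 # dual coefficients
    S = K @ Ainv                     # smoother matrix: fitted values = S y
    resid = y - S @ y
    dof = n - np.trace(S)            # effective residual degrees of freedom
    sigma2 = resid @ resid / dof     # noise variance estimate
    k_new = rbf_kernel(X_new, X, gamma)
    f = k_new @ alpha                # point predictions
    h = k_new @ Ainv                 # each row: linear weights on y at a new point
    # variance of a new noisy observation: noise + estimation variance
    var = sigma2 * (1.0 + (h ** 2).sum(axis=1))
    half = z * np.sqrt(var)
    return f, f - half, f + half
```

The `1 + ||h(x)||^2` factor combines the noise variance of a fresh observation with the variance of the fitted mean, which is the usual decomposition behind regression prediction intervals.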




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hwang, C., Seok, K.H., Cho, D. (2005). A Prediction Interval Estimation Method for KMSE. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_69


  • DOI: https://doi.org/10.1007/11539087_69

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28323-2

  • Online ISBN: 978-3-540-31853-8

  • eBook Packages: Computer Science (R0)
