
Design of a radial basis function neural network with a radius-modification algorithm using response surface methodology

Published in: Journal of Intelligent Manufacturing

Abstract

A radial basis function (RBF) neural network was designed for time series forecasting using both an adaptive learning algorithm and response surface methodology (RSM). To improve the traditional RBF network's forecasting capability, the generalized delta rule learning method was employed to modify the radius of the kernel function. Then RSM was utilized to explore the mean square error response surface so that the appropriate combination of network parameters, such as the number of hidden nodes and the initial learning rates, could be found. Extensive studies were performed on the effect of the initial values of connection weights on the accuracy of the backpropagation learning method that was employed in the training of the RBF artificial neural network. The effectiveness of the neural network with the proposed radius-modification technique and the RSM method was demonstrated with an example of forecasting intensity pulsations of a laser. It was found that, by utilizing the proposed techniques, the neural network provided a more accurate prediction of the response.
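The full paper is behind the access wall, so the following is only a minimal sketch of the idea described in the abstract: a Gaussian RBF network whose hidden-to-output weights and per-kernel radii are both updated with a generalized-delta-rule (gradient) step on the squared forecasting error. All names and hyperparameters here (the RBFNetwork class, lr_w, lr_r, the choice of centers) are illustrative assumptions, not the authors' implementation, and the RSM step for selecting the number of hidden nodes and initial learning rates is not reproduced.

```python
import numpy as np

class RBFNetwork:
    """Gaussian RBF network with delta-rule updates for output weights and kernel radii.

    Illustrative sketch only: centers are assumed fixed (e.g. chosen by clustering or
    sampled from the training data), with one adjustable radius per hidden node.
    """

    def __init__(self, centers, init_radius=1.0, lr_w=0.05, lr_r=0.005, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.asarray(centers, dtype=float)         # (H, d) kernel centers
        self.radii = np.full(len(self.centers), init_radius)    # one radius per kernel
        self.weights = rng.normal(0.0, 0.1, len(self.centers))  # hidden-to-output weights
        self.bias = 0.0
        self.lr_w, self.lr_r = lr_w, lr_r                       # learning rates (assumed values)

    def _phi(self, x):
        """Gaussian activations of all hidden nodes for a single input vector x."""
        d2 = np.sum((self.centers - np.asarray(x, dtype=float)) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.radii ** 2)), d2

    def predict(self, x):
        phi, _ = self._phi(x)
        return float(self.weights @ phi + self.bias)

    def train_step(self, x, target):
        """One gradient (delta-rule) step on squared error for weights, bias, and radii."""
        phi, d2 = self._phi(x)
        err = target - (self.weights @ phi + self.bias)
        # d(phi_j)/d(r_j) = phi_j * d2_j / r_j**3; descend on 0.5 * err**2
        grad_r = err * self.weights * phi * d2 / self.radii ** 3
        self.weights += self.lr_w * err * phi
        self.bias += self.lr_w * err
        self.radii += self.lr_r * grad_r
        return err ** 2  # squared error, for monitoring MSE


# Toy usage: one-step-ahead forecasting of a scalar series from lagged values.
if __name__ == "__main__":
    series = np.sin(0.3 * np.arange(500)) + 0.05 * np.random.default_rng(1).normal(size=500)
    lags = 5
    X = np.array([series[t - lags:t] for t in range(lags, len(series))])
    y = series[lags:]
    centers = X[np.random.default_rng(2).choice(len(X), 20, replace=False)]  # crude center choice
    net = RBFNetwork(centers)
    for epoch in range(30):
        mse = np.mean([net.train_step(xi, yi) for xi, yi in zip(X, y)])
    print("training MSE:", mse)
```

In the paper's setting, the remaining free parameters of such a network (number of hidden nodes, initial learning rates) are then treated as design factors and tuned by fitting a response surface to the resulting mean square error rather than by the ad hoc choices shown above.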




About this article

Cite this article

Chiu, C.C., Cook, D.F., Pignatiello, J.J., Jr. et al. Design of a radial basis function neural network with a radius-modification algorithm using response surface methodology. Journal of Intelligent Manufacturing 8, 117–124 (1997). https://doi.org/10.1023/A:1018504704266
