Optimizing RBF Networks with Cooperative/Competitive Evolution of Units and Fuzzy Rules

  • Conference paper
Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence (IWANN 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2084)

Abstract

This paper presents a new evolutionary method for designing optimal networks of Radial Basis Functions (RBFs). Its main characteristics lie in the estimation of the fitness of each neuron in the population and in the choice of the operator to apply; for the latter, a set of fuzzy rules is used. The fitness estimation is based on three main factors: the weight of the neuron in the RBF network, the overlap among neurons, and the distances from the neurons to the points where the approximation is worst. These factors allow us to define a fitness function in which concepts such as cooperation, speciation, and niching are taken into account. The same three factors are also used as linguistic variables in a fuzzy logic system that chooses the operator to apply. The proposed method has been tested on the Mackey-Glass time series.
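To illustrate how the three factors named in the abstract might be combined into a per-unit score, the following Python sketch is offered. It is not taken from the paper: the function names, the Gaussian basis, the use of normalised co-activation as the overlap measure, the `n_worst` parameter, and the equal weighting of the factors are all assumptions made for illustration.

```python
import numpy as np

def rbf_activations(X, centers, widths):
    """Gaussian activations of each RBF unit for every input sample."""
    # X: (n_samples, n_dims); centers: (n_units, n_dims); widths: (n_units,)
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dist / (2.0 * widths[None, :] ** 2))

def unit_scores(X, y, centers, widths, out_weights, n_worst=10):
    """Illustrative per-unit score combining the three factors from the
    abstract; the exact measures and weightings are assumptions."""
    phi = rbf_activations(X, centers, widths)          # (n_samples, n_units)
    residuals = np.abs(y - phi @ out_weights)

    # Factor 1: relative magnitude of each unit's output weight.
    weight_factor = np.abs(out_weights) / (np.abs(out_weights).sum() + 1e-12)

    # Factor 2: overlap, measured here as mean normalised co-activation
    # with the other units (higher overlap -> lower score).
    phi_n = phi / (np.linalg.norm(phi, axis=0, keepdims=True) + 1e-12)
    gram = phi_n.T @ phi_n
    n_units = gram.shape[0]
    overlap = (gram.sum(axis=1) - 1.0) / max(n_units - 1, 1)

    # Factor 3: closeness of each centre to the points with the largest
    # approximation error.
    worst = X[np.argsort(residuals)[-n_worst:]]
    dist_to_worst = np.linalg.norm(
        centers[:, None, :] - worst[None, :, :], axis=-1).mean(axis=1)
    closeness = 1.0 / (1.0 + dist_to_worst)

    # Cooperative units (useful weight, close to hard points) score high;
    # units competing for the same region (high overlap) are penalised.
    return weight_factor + closeness - overlap
```

In the method described by the paper, these same factor values would additionally serve as linguistic variables of a fuzzy rule base that decides which evolutionary operator to apply to each unit; that rule base is not reproduced here.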

Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rivera, A.J., Ortega, J., Rojas, I., Prieto, A. (2001). Optimizing RBF Networks with Cooperative/Competitive Evolution of Units and Fuzzy Rules. In: Mira, J., Prieto, A. (eds) Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence. IWANN 2001. Lecture Notes in Computer Science, vol 2084. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45720-8_68

  • DOI: https://doi.org/10.1007/3-540-45720-8_68

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42235-8

  • Online ISBN: 978-3-540-45720-6

  • eBook Packages: Springer Book Archive
