
Adaptive internal representation in circular back-propagation networks

Published in Neural Computing & Applications

Abstract

Surface-based and prototype-based models are often regarded as alternative paradigms for representing internal knowledge in trained neural networks. This paper analyses a network model (Circular Back-Propagation, CBP) that overcomes this dualism by selecting the best-fitting representation adaptively. The model involves a straightforward modification to classical feed-forward structures that lets neurons implement hyperspherical boundaries; as a result, it exhibits notable representation power while retaining the simplicity and effectiveness of classical back-propagation training. Artificial testbeds support the model definition by demonstrating its basic properties, and an application to a real, complex problem in the clinical field shows the practical advantages of the approach.
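The modification the abstract refers to can be illustrated with a minimal sketch (not the authors' code; function and variable names are hypothetical): a CBP neuron receives one extra input equal to the squared norm of the input vector. With linear weights w, an extra weight ws on that quadratic term, and bias b, the decision surface w·x + ws·||x||² + b = 0 is a hypersphere when ws ≠ 0 and degenerates to the classical hyperplane when ws = 0, so training can settle on either representation.

```python
# Hedged sketch of a Circular Back-Propagation neuron's pre-activation.
# The only change from a standard perceptron unit is the ws * ||x||^2
# term, fed by one additional input computed from x itself.

def cbp_activation(x, w, ws, b):
    """Pre-activation of a CBP neuron on input vector x."""
    sq_norm = sum(xi * xi for xi in x)          # extra quadratic input
    linear = sum(wi * xi for wi, xi in zip(w, x))
    return linear + ws * sq_norm + b

# Example: with w = (0, 0), ws = 1, b = -1 the boundary is the unit circle.
inside  = cbp_activation([0.5, 0.0], [0.0, 0.0], 1.0, -1.0)   # -0.75 (inside)
on_edge = cbp_activation([1.0, 0.0], [0.0, 0.0], 1.0, -1.0)   #  0.0  (on boundary)
outside = cbp_activation([2.0, 0.0], [0.0, 0.0], 1.0, -1.0)   #  3.0  (outside)
```

Setting ws = 0 recovers the ordinary linear unit, which is why the model can "choose" between surface-based and prototype-based representations during training.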




Cite this article

Ridella, S., Rovetta, S. & Zunino, R. Adaptive internal representation in circular back-propagation networks. Neural Comput & Applic 3, 222–233 (1995). https://doi.org/10.1007/BF01414647
