Learning rare categories in backpropagation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 991)

Abstract

Hierarchical systems of neural networks (HNNs) based on the backpropagation algorithm were used to test the hypothesis that rare categories can be learned more accurately, and in shorter training times, than in nonhierarchical neural networks that are also based on backpropagation. In two artificial data sets, the problem of learning rare categories was quantified and an existing solution was shown to be inadequate. HNNs were then compared with nonhierarchical neural networks: in both artificial examples, the HNNs performed better in terms of sensitivity and time to train, while specificities were not significantly different. These results were confirmed in two real-world examples.
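
The abstract does not spell out the hierarchical decomposition itself, so the following is only a minimal sketch of the general idea, not the authors' implementation. It assumes a two-stage arrangement written in Python with scikit-learn: a flat backpropagation network (MLPClassifier) learns all categories at once, while the hierarchical version first separates a broad group of categories and then trains a second network only on that group, where the rare category makes up a much larger share of the training examples. The synthetic data, class proportions, and hidden-layer sizes are illustrative assumptions.

    # Sketch only: a flat backpropagation classifier vs. a two-stage
    # "hierarchical" arrangement on a synthetic problem with one rare class.
    # Data, class definitions, and network sizes are assumptions for illustration.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import recall_score

    rng = np.random.default_rng(0)

    # Synthetic 3-class problem: classes 0 and 1 are common, class 2 is rare (~3%).
    n = 6000
    X = rng.normal(size=(n, 10))
    score_a = X[:, 0] + rng.normal(scale=0.5, size=n)
    score_b = X[:, 1] + rng.normal(scale=0.5, size=n)
    y = np.where(score_a < 0, 0, np.where(score_b < 1.8, 1, 2))

    split = n // 2
    X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

    # Flat network: one backpropagation classifier over all three classes at once.
    flat = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
    flat.fit(X_tr, y_tr)
    flat_pred = flat.predict(X_te)

    # Hierarchical arrangement: stage 1 separates class 0 from the group {1, 2};
    # stage 2 is trained only on {1, 2} examples, where the rare class is no
    # longer swamped by the most frequent category.
    stage1 = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
    stage1.fit(X_tr, (y_tr > 0).astype(int))

    mask12 = y_tr > 0
    stage2 = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
    stage2.fit(X_tr[mask12], y_tr[mask12])

    hier_pred = np.zeros(len(X_te), dtype=int)
    in_group = stage1.predict(X_te) == 1
    hier_pred[in_group] = stage2.predict(X_te[in_group])

    # Sensitivity (recall) on the rare class is the quantity compared in the paper;
    # specificity could be computed analogously from the confusion matrix.
    print("flat sensitivity, rare class:        ",
          recall_score(y_te, flat_pred, labels=[2], average=None)[0])
    print("hierarchical sensitivity, rare class:",
          recall_score(y_te, hier_pred, labels=[2], average=None)[0])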

Editor information

Jacques Wainer, Ariadne Carvalho

Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ohno-Machado, L., Musen, M.A. (1995). Learning rare categories in backpropagation. In: Wainer, J., Carvalho, A. (eds) Advances in Artificial Intelligence. SBIA 1995. Lecture Notes in Computer Science, vol 991. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0034813

  • DOI: https://doi.org/10.1007/BFb0034813

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-60436-5

  • Online ISBN: 978-3-540-47467-8

  • eBook Packages: Springer Book Archive
