Supervised extended ART: A fast neural network classifier trained by combining supervised and unsupervised learning

  • Published in: Applied Intelligence

Abstract

A neural network classifier, called supervised extended ART (SEART), that incorporates a supervised mechanism into the extended unsupervised ART is presented here. It builds on the Nested Generalized Exemplar (NGE) learning theory. At any time, a training instance may or may not come with a desired output; that is, the model can handle supervised and unsupervised learning simultaneously. The unsupervised component discovers the cluster structure of the instances, and the supervised component learns the desired associations between clusters and classes. In addition, the model is capable of incremental learning, works equally well when the instances of a cluster belong to different classes, and handles multi-category and nonconvex classification problems. The experimental results are very encouraging.
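To make the division of labour described in the abstract concrete, the following is a minimal Python sketch of the general idea only: an ART-style vigilance test and prototype update standing in for the unsupervised clustering step, together with a per-cluster frequency table that learns the association between clusters and class labels. The class name SimpleSEARTLikeClassifier, the Fuzzy-ART-like match rule, and the majority-vote prediction are illustrative assumptions, not the SEART algorithm defined in the paper.

```python
import numpy as np


class SimpleSEARTLikeClassifier:
    """Illustrative sketch only: ART-style clustering plus a cluster-to-class table.
    This is NOT the authors' SEART algorithm; the match rule, learning update,
    and voting scheme are simplifying assumptions made for illustration."""

    def __init__(self, vigilance=0.75, learning_rate=1.0):
        self.vigilance = vigilance      # similarity threshold for accepting a cluster
        self.learning_rate = learning_rate
        self.prototypes = []            # one prototype vector per cluster
        self.class_counts = []          # per-cluster dict: class label -> count

    def _match(self, x, w):
        # Fuzzy-ART-like match score |min(x, w)| / |x| (an assumption, not SEART's rule).
        return np.minimum(x, w).sum() / (x.sum() + 1e-12)

    def partial_fit(self, x, label=None):
        """Present one instance; label may be None (unsupervised) or a class name."""
        x = np.asarray(x, dtype=float)
        # Try existing clusters in order of decreasing match (category search).
        for j in sorted(range(len(self.prototypes)),
                        key=lambda k: self._match(x, self.prototypes[k]),
                        reverse=True):
            if self._match(x, self.prototypes[j]) >= self.vigilance:
                # Resonance: move the prototype toward the instance (fast learning at rate 1).
                w = self.prototypes[j]
                self.prototypes[j] = (self.learning_rate * np.minimum(x, w)
                                      + (1.0 - self.learning_rate) * w)
                if label is not None:
                    self.class_counts[j][label] = self.class_counts[j].get(label, 0) + 1
                return j
        # No cluster passed the vigilance test: commit a new one (incremental learning).
        self.prototypes.append(x.copy())
        self.class_counts.append({} if label is None else {label: 1})
        return len(self.prototypes) - 1

    def predict(self, x):
        """Classify by the majority class recorded for the best-matching cluster."""
        if not self.prototypes:
            return None
        x = np.asarray(x, dtype=float)
        j = max(range(len(self.prototypes)),
                key=lambda k: self._match(x, self.prototypes[k]))
        counts = self.class_counts[j]
        return max(counts, key=counts.get) if counts else None
```

Usage might look like the following (again, purely illustrative): labeled and unlabeled instances can be interleaved, mirroring the claim that supervised and unsupervised learning are handled simultaneously.

```python
clf = SimpleSEARTLikeClassifier(vigilance=0.8)
clf.partial_fit([0.9, 0.1], label="A")   # supervised instance
clf.partial_fit([0.85, 0.15])            # unlabeled instance, clustering only
print(clf.predict([0.88, 0.12]))         # -> "A"
```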

Cite this article

Lee, HM., Lai, CS. Supervised extended ART: A fast neural network classifier trained by combining supervised and unsupervised learning. Appl Intell 6, 117–128 (1996). https://doi.org/10.1007/BF00117812
