An experimental evaluation of neural networks for classification

https://doi.org/10.1016/0305-0548(93)90063-O

Abstract

Artificial neural networks are a relatively new approach to classification. In this paper, we describe how to build neural network models for classification and compare them with classical models, namely linear discriminant analysis and quadratic discriminant analysis. Neural network models can solve some difficult classification problems on which classical models fail, and the results show that even under the conditions most favorable to the classical models, neural networks remain quite competitive. Furthermore, neural networks are more robust: they are less sensitive to changes in sample size, number of groups, number of variables, proportions of group memberships, and degrees of overlap among groups.
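As a loose illustration of the comparison described in the abstract (not the authors' experimental design; all data and parameters here are invented for the example), the sketch below fits Fisher's linear discriminant and a minimal single-logistic-unit "neural network" to two overlapping Gaussian groups and reports training accuracy for each:

```python
# Minimal sketch: Fisher's linear discriminant vs. a one-unit logistic
# "neural network" on two overlapping 2-D Gaussian groups.
import math
import random

random.seed(0)

def make_group(mu, n):
    # n points drawn around mean mu with unit standard deviation
    return [(random.gauss(mu[0], 1.0), random.gauss(mu[1], 1.0)) for _ in range(n)]

g0 = make_group((0.0, 0.0), 100)   # group 0
g1 = make_group((2.0, 2.0), 100)   # group 1
X = g0 + g1
y = [0] * 100 + [1] * 100

# --- Fisher's linear discriminant (pooled within-group covariance) ---
def mean(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

m0, m1 = mean(g0), mean(g1)

def pooled_cov(g0, g1, m0, m1):
    # 2x2 pooled within-group covariance matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for pts, m in ((g0, m0), (g1, m1)):
        for p in pts:
            d = (p[0] - m[0], p[1] - m[1])
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    n = len(g0) + len(g1) - 2
    return [[s[i][j] / n for j in range(2)] for i in range(2)]

S = pooled_cov(g0, g1, m0, m1)
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
Sinv = [[S[1][1] / det, -S[0][1] / det], [-S[1][0] / det, S[0][0] / det]]
dm = (m1[0] - m0[0], m1[1] - m0[1])
w = (Sinv[0][0] * dm[0] + Sinv[0][1] * dm[1],
     Sinv[1][0] * dm[0] + Sinv[1][1] * dm[1])
mid = ((m0[0] + m1[0]) / 2, (m0[1] + m1[1]) / 2)
b = -(w[0] * mid[0] + w[1] * mid[1])   # boundary through the midpoint (equal priors)

def lda_predict(p):
    return 1 if w[0] * p[0] + w[1] * p[1] + b > 0 else 0

# --- single logistic unit trained by stochastic gradient descent ---
wn = [0.0, 0.0]
bn = 0.0
lr = 0.1
for _ in range(200):                   # 200 passes over the sample
    for p, t in zip(X, y):
        z = wn[0] * p[0] + wn[1] * p[1] + bn
        a = 1.0 / (1.0 + math.exp(-z))
        err = a - t                    # gradient of log-loss w.r.t. z
        wn[0] -= lr * err * p[0]
        wn[1] -= lr * err * p[1]
        bn -= lr * err

def nn_predict(p):
    return 1 if wn[0] * p[0] + wn[1] * p[1] + bn > 0 else 0

acc = lambda f: sum(f(p) == t for p, t in zip(X, y)) / len(X)
print(f"LDA accuracy: {acc(lda_predict):.2f}")
print(f"NN  accuracy: {acc(nn_predict):.2f}")
```

With groups this close together the two methods perform comparably, which mirrors the abstract's point that neural networks are competitive even where the classical assumptions (here, equal Gaussian covariances) hold.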

References (43)

  • W.V. Gehrlein, General mathematical programming formulations for the statistical classification problem, Ops Res. Lett. (1986)
  • J.W. Denton et al., A neural network approach to the classification problem, Expert Syst. Applic. (1990)
  • A.D. Gordon, Classification (1981)
  • N. Capon, Credit scoring systems: a critical analysis, J. Market. (1982)
  • S. Sharma et al., Early warning indicators of business failures, J. Market. (1980)
  • R.A. Walkling, Predicting tender offer success: a logistic analysis, J. Financial Quant. Analysis (1985)
  • R.Y. Awh et al., A discriminant analysis of economic, demographic, and attitudinal characteristics of bank charge-card holders: a case study, J. Finance (1983)
  • R.O. Duda et al., Pattern Classification and Scene Analysis (1973)
  • R.A. Fisher, The statistical utilization of multiple measurements, Annals Eugen. (1938)
  • C.A.B. Smith, Some examples of discrimination, Annals Eugen. (1946)
  • R. Lippmann, An introduction to computing with neural nets, IEEE ASSP Mag. (1987)
  • J.C. Hoskins et al., Incipient fault detection and diagnosis using artificial neural networks
  • K.I. Lang et al., Learning to tell two spirals apart
  • Y. Le Cun et al., Handwritten digit recognition with a back-propagation network, Adv. Neural Inform. Process. Syst. (1990)
  • G.L. Martin et al., Recognizing hand-printed letters and digits, Adv. Neural Inform. Process. Syst. (1990)
  • M.T. Leung et al., Fingerprint processing using backpropagation neural networks
  • D.B. Malkoff, A neural network for real-time signal processing, Adv. Neural Inform. Process. Syst. (1990)
  • T.J. Sejnowski et al., Combining visual and acoustic speech signals with a neural network improves intelligibility
  • P.M. Shea et al., Operational experience with a neural network in the detection of explosives in checked airline baggage
  • J.C. Singleton, Neural nets for bond rating improved by multiple hidden layers
  • K.Y. Tam et al., Managerial applications of neural networks: the case of bank failure predictions, Mgmt Sci. (1992)
Cited by (56)

    • MLR and ANN models of significant wave height on the west coast of India

      2012, Computers and Geosciences
      Citation Excerpt:

      They pointed out that ANN can provide a good alternative to statistical regression, time series analysis and numerical methods. Comparison studies between statistical methods and ANN have been carried out by many authors (Aslanargun et al., 2007; Kumar, 2005; Shuhui, Wunsch, Hair, and Giesselmann, 2001; Warner and Misra, 1996; Subramanian, Hung, and Hu, 1993; Patuwo, Hu, and Hung, 1993; Hruschka, 1993; Tam and Kiang, 1992; Salchenberger, Cinar, and Lash, 1992; Tam, 1991). Mahjoobi and Etemad-Shahidi, 2008, 2009 investigated the performances of classification and regression trees.

    • A tuning method for the architecture of neural network models incorporating GAM and GA as applied to bankruptcy prediction

      2012, Expert Systems with Applications
      Citation Excerpt:

      For example, Salchenberger et al. (1992) recommend that the number of hidden nodes should be 75% of the number of input variables. Subramanian, Ming, and Hu (1993) indicated that the number of nodes in a single hidden layer should range from the number of output nodes to the number of input variables plus one. Later, more systematic methods were suggested by Masters (1993) and Torsun (1996), who proposed a geometric progression rule.

    • A computational study on the performance of artificial neural networks under changing structural design and data distribution

      2002, European Journal of Operational Research
      Citation Excerpt:

      Our conjecture is open to future tests, however. The number of classification groups, together with the number of attributes and prediction variables, indicate a measure of task complexity [35]. The task complexity is a consideration in the choice of a discriminant analysis technique.
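The hidden-layer sizing heuristics quoted in the excerpts above reduce to simple arithmetic. The helpers below merely restate the two rules; the function names are hypothetical (mine, not from the cited papers):

```python
# Hypothetical helpers restating the hidden-node sizing heuristics quoted
# in the citation excerpts above.

def hidden_nodes_salchenberger(n_inputs):
    # Salchenberger et al. (1992): roughly 75% of the number of input variables
    return round(0.75 * n_inputs)

def hidden_nodes_subramanian_range(n_inputs, n_outputs):
    # Subramanian, Ming and Hu (1993): from the number of output nodes
    # up to the number of input variables plus one
    return (n_outputs, n_inputs + 1)

print(hidden_nodes_salchenberger(8))         # 6
print(hidden_nodes_subramanian_range(8, 2))  # (2, 9)
```

For a network with 8 inputs and 2 outputs, the first rule suggests 6 hidden nodes and the second a range of 2 to 9, so the two heuristics are broadly consistent with each other.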


    V. Subramanian received a Ph.D. in Business from Kent State University. His research interests include neural network theory and applications, object-oriented systems, database theory, and applications of AI to business.

    §

    His research interests include applications of neural networks, applied statistics, marketing research and international business.
