An improved neural classification network for the two-group problem

https://doi.org/10.1016/S0305-0548(98)00076-8

Abstract

In this paper we present the neural network model known as the mixture-of-experts (MOE) and assess its accuracy and robustness. We do this by comparing the classification accuracy of MOE, the backpropagation neural network (BPN), Fisher's discriminant analysis, logistic regression, k-nearest neighbor, and the kernel density method on five real-world two-group data sets. Our results lead to three major conclusions: (1) the MOE network architecture is more accurate than BPN; (2) MOE tends to be more accurate than the parametric and non-parametric methods investigated; (3) MOE is a far more robust classifier than the other methods for the two-group problem.

Scope and purpose

The two-group classification problem is the assignment of objects to one of two predetermined groups. These classification decisions are routinely made in business, health care, and government. Classification errors result when an object is assigned to the wrong group. A healthy patient might be diagnosed with cancer, or a patient suffering from cancer might be diagnosed as healthy. A firm assigned the wrong bond rating will incur millions of dollars in additional cost of capital. Unfortunately, there is no classification method that is the "best" for all data sets. Some classification methods will do well on a given data set only to perform poorly on another. What is needed, therefore, is a robust methodology that classifies accurately across a wide range of two-group data sets. Researchers and practitioners have turned to neural classification models in a quest for a robust method. The purpose of this research is to investigate the accuracy and robustness of the mixture-of-experts neural network (MOE). Our research demonstrates that MOE performs very well against backpropagation neural networks and four traditional statistical classification procedures, based upon results from five diverse real-world data sets. We conclude that the mixture-of-experts neural network may be the robust classification scheme being sought for the two-group problem.
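To make the architecture concrete, the forward pass of a mixture-of-experts classifier for the two-group problem can be sketched as below. This is a minimal illustration assuming linear experts and a linear gating network with random (untrained) parameters; all names, dimensions, and values are hypothetical and do not reproduce the authors' fitted networks. Each expert outputs a probability of group membership, and the gating network produces input-dependent mixing weights, so the final prediction is a convex combination of expert opinions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax: mixing weights that sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_features, n_experts = 4, 3

# Hypothetical random parameters; a real MOE fits these jointly,
# e.g. by gradient descent on the mixture likelihood.
W_experts = rng.normal(size=(n_experts, n_features))  # one linear expert each
b_experts = rng.normal(size=n_experts)
W_gate = rng.normal(size=(n_experts, n_features))     # gating-network weights
b_gate = rng.normal(size=n_experts)

def moe_predict_proba(x):
    """Probability of group 1 for a single feature vector x."""
    expert_probs = sigmoid(W_experts @ x + b_experts)  # each expert's P(group 1)
    gate_weights = softmax(W_gate @ x + b_gate)        # input-dependent mixing
    return float(gate_weights @ expert_probs)          # convex combination

x = rng.normal(size=n_features)
p = moe_predict_proba(x)
group = 1 if p >= 0.5 else 0
```

Because the gate depends on the input, different experts can dominate in different regions of feature space, which is the intuition behind MOE's robustness across data sets.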


Cited by (28)

  • Comparing performances of backpropagation and genetic algorithms in the data classification

    2011, Expert Systems with Applications
    Citation Excerpt :

    In these mathematical programming techniques, minimization of the sum of deviations approaches, maximum of the minimum deviation approaches, goal programming approaches, mixed-integer programming approaches and hyper-box representation approaches are the most commonly used optimization approaches. Artificial neural networks (ANNs) have popularity in solving several business and technical problems that involve prediction, and have also a wide ranging usage area in the classification problems (Denton, Hung, & Osyk, 1990; Holmstrom, Koistinen, Laaksonen, & Oja, 1997; Mangiameli & West, 1999; Patwo, Hu, & Hung, 1993; Pendharkar, 2005; Yim & Mitchell, 2005). One of the important issues on the neural networks is training of the networks.

  • Statistics over features for internal carotid arterial disorders detection

    2008, Computers in Biology and Medicine
    Citation Excerpt :

    The outputs of expert networks are combined by a gating network simultaneously trained in order to stochastically select the expert that is performing the best at solving the problem [2,3]. As pointed out by Jordan and Jacobs [4], the gating network performs a typical multiclass classification task [5–7]. Although the ME architecture has been successfully applied to several supervised learning tasks, it can only use a composite feature for classification with diverse features, since both gating and expert networks need to receive the same input.
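The gate-and-combine step described in the excerpt above can be illustrated with a small sketch. The logits and expert outputs below are hypothetical placeholders, not values from any cited network; the point is the two uses of the gate: soft weighting of expert outputs versus stochastic selection of a single expert.

```python
import numpy as np

rng = np.random.default_rng(42)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical gating-network logits for three experts on one input.
gate_logits = np.array([2.0, 0.5, -1.0])
gate_probs = softmax(gate_logits)  # a probability distribution over experts

# Stochastic selection: sample one expert in proportion to its gate probability.
chosen = rng.choice(len(gate_probs), p=gate_probs)

# Soft combination: weight each expert's output by its gate probability.
expert_outputs = np.array([0.9, 0.4, 0.1])  # hypothetical P(group 1) per expert
mixture_output = float(gate_probs @ expert_outputs)
```

Selecting among experts this way is a multiclass decision over the experts themselves, which is why the excerpt characterizes the gating network as performing a typical multiclass classification task.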


Paul Mangiameli is a Professor of Management Science and Information Systems in the College of Business Administration of the University of Rhode Island. His research interests are in neural network applications, process control, and quality management. He has published extensively in such journals as Decision Sciences, Journal of Operations Management, European Journal of Operational Research, Omega – The International Journal of Management Science, Managerial and Decision Economics, International Review of Economics and Finance, and Interfaces.


David West is an Associate Professor of Decision Sciences at East Carolina University in Greenville, North Carolina. He received his Ph.D. in Business Administration from the University of Rhode Island. His research interests include the application of neural network technology to such areas as classification decisions, manufacturing process control, and group clustering. He has published in the European Journal of Operational Research and Omega – The International Journal of Management Science.
