Neural Networks

Volume 16, Issues 5–6, June–July 2003, Pages 779-784

2003 Special issue
Extension neural network and its applications

https://doi.org/10.1016/S0893-6080(03)00104-7

Abstract

In this paper, a novel extension neural network (ENN) is proposed. This new neural network is a combination of extension theory and a neural network. It uses an extension distance (ED) to measure the similarity between data and a cluster center. The learning speed of the proposed ENN is shown to be faster than that of traditional neural networks and other fuzzy classification methods. Moreover, the new scheme is shown to achieve high accuracy with lower memory consumption. Experimental results from two different examples verify the effectiveness and applicability of the proposed work.

Introduction

Neural networks are parallel systems used for solving regression and classification problems in many fields (Rumelhart and McClelland, 1986, Kohonen, 1988, Specht, 1990, Carpenter et al., 1991). They can estimate a relation function between the inputs and outputs through a learning process, and can also discover the mapping from feature space to the space of classes. Classification, or cluster analysis, is one of the most important applications of neural networks. The goal of classification is to partition a set of patterns into a group of desired subsets. There are many popular methods for applying neural networks to pattern recognition, such as multilayer perceptrons (MLP) (Rumelhart & McClelland, 1986), Kohonen neural networks (KNN) (Kohonen, 1988), probabilistic neural networks (PNN) (Specht, 1990), learning vector quantization (LVQ) (Bezdek & Pal, 1995), counter-propagation networks (CPN) (Hecht-Nielsen, 1987), and adaptive resonance theory (ART) networks (Carpenter et al., 1991). These networks have been applied successfully in many fields.

The MLP is a continuous-input, continuous-output pattern recognizer that excels at supervised learning. Its most popular training method is error back-propagation. Its drawbacks are that there is no good strategy for choosing the number of neurons in the hidden layers, and training is time-consuming. The KNN is an unsupervised pattern recognizer. It employs a winner-take-all learning strategy to store similar patterns in one neuron. KNN has been applied successfully to phonetic and image pattern recognition, but there is no good strategy for choosing the learning parameters and the neighborhood region. The ART network is an unsupervised learning and adaptive pattern recognition system. It can quickly and stably learn to categorize input patterns, and it permits an adaptive process for significant new information. On the other hand, many methods have been proposed for designing fuzzy classification systems to deal with fuzzy classification problems (Hong and Chen, 1999, Hong and Chen, 2000, Wang et al., 1999, Yu and Chen, 2002). Fuzzy approaches can incorporate human expertise and have been applied successfully in this field. However, they have some intrinsic shortcomings, such as the difficulty of acquiring knowledge and maintaining a database.

In the real world, there are classification problems whose features are defined over ranges. For example, boys can be defined as the cluster of males aged 1 to 14, and the permitted operating voltages of a specified motor may be between 100 and 120 V. For such problems, it is not easy to implement an appropriate classification method using current neural networks. Therefore, a new neural network topology, called the extension neural network (ENN), is proposed in this paper to solve these problems. In other words, the ENN handles classification problems that have range features, supervised learning, and continuous inputs with discrete outputs. This new neural network is a combination of extension theory (Cai, 1983, Hong and Chen, 1999, Wang and Chen, 2001) and a neural network. The ENN uses a modified extension distance (ED) to measure the similarity between data and a cluster center; it permits an adaptive process for significant new information, and gives shorter learning times than traditional neural networks. Moreover, the ENN has shown higher accuracy and lower memory consumption in application.
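The range-feature idea can be stated concretely. Assuming, as a purely illustrative setup, that each cluster is described by one interval per feature, a crisp membership test looks like this (function and variable names are hypothetical, not from the paper):

```python
def in_cluster(pattern, intervals):
    """Crisp test: every feature must fall inside its cluster interval."""
    return all(lo <= x <= hi for x, (lo, hi) in zip(pattern, intervals))

# The two range examples from the text
boy_age = [(1, 14)]            # boys: males aged 1 to 14
motor_voltage = [(100, 120)]   # permitted operating voltage, in volts

print(in_cluster([10], boy_age))         # -> True
print(in_cluster([125], motor_voltage))  # -> False
```

A crisp test like this gives no ranking when a pattern falls outside every interval, or inside several at once; the ENN instead uses a graded extension distance so that every pattern can be assigned to the nearest cluster.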

Review of extension theory

Extension theory was originally proposed by Cai in 1983 to solve contradiction and incompatibility problems (Cai, 1983). The extension set extends the membership range of the fuzzy set from [0,1] to (−∞, ∞). As a result, it allows us to define a set that includes any data in the domain.
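A central notion of extension theory is the extension distance of a point x to an interval ⟨a, b⟩, commonly written ρ(x, ⟨a, b⟩) = |x − (a+b)/2| − (b−a)/2. It is negative inside the interval, zero at the endpoints, and grows without bound outside, which is how the extension set extends membership from [0,1] to (−∞, ∞). A minimal sketch:

```python
def extension_distance(x, a, b):
    """Extension distance of point x to the interval <a, b>.

    Negative inside the interval, zero at the endpoints,
    positive and growing linearly outside.
    """
    center = (a + b) / 2.0
    half_width = (b - a) / 2.0
    return abs(x - center) - half_width

# Using the motor-voltage interval <100, 120> from the introduction:
print(extension_distance(110, 100, 120))  # -> -10.0 (inside)
print(extension_distance(120, 100, 120))  # -> 0.0   (at an endpoint)
print(extension_distance(130, 100, 120))  # -> 10.0  (outside)
```

Unlike a crisp membership test, this signed distance ranks how far inside or outside the interval a point lies.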

Extension neural network

The proposed ENN is a combination of the neural network and extension theory. Extension theory provides a novel distance measurement for the classification process, while the neural network contributes parallel computation power and learning capability. In other words, the ENN handles classification problems that have range features, supervised learning, and continuous inputs with discrete outputs.
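The paper's exact modified ED is not reproduced in this snippet, so the sketch below uses a normalized form common in the ENN literature: each feature contributes 0 at the center of its cluster interval, 1 at either boundary, and grows linearly beyond it, and a pattern is assigned to the cluster with the smallest total distance. The cluster names and data are hypothetical, and the precise weighting in the paper may differ.

```python
def enn_ed(x, lower, upper):
    """Total extension distance of feature vector x to a cluster
    described by per-feature intervals [lower_j, upper_j].

    Each feature contributes 0 at the interval center, 1 at either
    boundary, and grows linearly outside the interval.
    """
    total = 0.0
    for xj, lj, uj in zip(x, lower, upper):
        center = (lj + uj) / 2.0
        half_width = (uj - lj) / 2.0
        total += (abs(xj - center) - half_width) / abs(half_width) + 1.0
    return total

def classify(x, clusters):
    """Assign x to the cluster with the smallest extension distance."""
    return min(clusters, key=lambda name: enn_ed(x, *clusters[name]))

# Hypothetical two-cluster problem with two range features each
clusters = {
    "low_voltage":  ([90, 0], [110, 5]),
    "high_voltage": ([110, 5], [130, 10]),
}
print(classify([95, 2], clusters))  # -> low_voltage
```

In a full ENN, the interval bounds and cluster centers are tuned by supervised learning rather than fixed in advance; only the distance computation is shown here.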

Experimental results

In this paper, the Iris data classification problem (Chien, 1978) and vibration diagnosis problems (Li et al., 1999, Li et al., 2000) are used to illustrate the effectiveness of the proposed ENN.

Conclusions

This paper presents a novel ENN based on extension theory and the neural network. Compared with traditional neural networks and other fuzzy classification methods, it permits an adaptive process for significant new information and gives shorter learning times. The proposed ENN can solve special classification problems whose features are defined over ranges. Moreover, with the proposed ED, different ranges of the classical domain can yield different distances.

Acknowledgements

The authors gratefully acknowledge the support of the National Science Council, Taiwan, ROC, under the grant no. NSC-91-2213-E-167-014.
