Elsevier

Applied Soft Computing

Volume 9, Issue 1, January 2009, Pages 100-106
Using genetic algorithm to select the presentation order of training patterns that improves simplified fuzzy ARTMAP classification performance

https://doi.org/10.1016/j.asoc.2008.03.003

Abstract

The presentation order of training patterns to a simplified fuzzy ARTMAP (SFAM) neural network affects its classification performance. The common remedy is to run several simulations with the training patterns presented in random order and to compute the final performance with a voting strategy. Recently, an ordering method based on min–max clustering was introduced to select the presentation order of training patterns using a single simulation. In this paper, another single-simulation method, based on a genetic algorithm, is proposed to obtain a presentation order of training patterns that improves the performance of SFAM. The proposed method is applied to a 40-class individual classification problem using visual evoked potential signals and to three other datasets from the UCI repository. Compared to the random ordering and min–max methods, the proposed method offers improved classification performance, a smaller network size and a lower training time. Compared to the random ordering method, the new ordering scheme has the additional advantage of requiring only a single simulation. As the proposed method is general, it can also be applied to a fuzzy ARTMAP neural network when the latter is used as a classifier.

Introduction

Fuzzy ARTMAP (FAM) [1] and simplified fuzzy ARTMAP (SFAM) [2] belong to a special class of neural networks (NNs) capable of incremental learning. In the fast learning mode, these networks have a lower training time than other NN architectures such as the multilayer perceptron. SFAM and FAM have been used in numerous classification problems [1], [2], [3], [4], [5], [6]. The FAM architecture has three modules: Fuzzy ARTa, Fuzzy ARTb and Inter ART. SFAM differs from FAM in that it is intended purely for classification and therefore omits Fuzzy ARTb, which is redundant for this purpose.

The presentation order of training patterns affects the classification (i.e. generalisation) performance of SFAM. To solve this problem, SFAM is trained several times with the training patterns presented in random order (i.e. permutations of the training patterns), and the predicted classes of the test patterns are stored. Majority votes then determine the final class prediction for each test pattern [1]. It is also customary to report the average classification accuracy on the test patterns over all the simulations in addition to the voting results. To avoid running many simulations, a single-simulation method based on min–max clustering was proposed [3]. For a c-class problem, the method first selects c training patterns that are maximally distant from one another in the training feature space; the remaining patterns are then ordered by their minimal distance to these c patterns. Hence, it is known as min–max ordering.
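The two-stage selection described above can be sketched as follows. This is an illustrative reading of min–max ordering, not the authors' implementation: the greedy max-distance seeding, the Euclidean metric and the norm-based tie-break for the first seed are assumptions.

```python
import numpy as np

def min_max_order(patterns, n_classes):
    """Sketch of min-max ordering: pick n_classes seed patterns that are
    maximally distant from each other, then append the remaining patterns
    in order of minimal distance to the seed set."""
    patterns = np.asarray(patterns, dtype=float)
    remaining = list(range(len(patterns)))
    # Seed with the pattern of largest norm (an arbitrary tie-break).
    seeds = [max(remaining, key=lambda i: np.linalg.norm(patterns[i]))]
    remaining.remove(seeds[0])
    # Greedily add the pattern farthest from all previously chosen seeds.
    while len(seeds) < n_classes and remaining:
        nxt = max(remaining,
                  key=lambda i: min(np.linalg.norm(patterns[i] - patterns[s])
                                    for s in seeds))
        seeds.append(nxt)
        remaining.remove(nxt)
    # Order the rest by minimal distance to the seed set ("min" step).
    rest = sorted(remaining,
                  key=lambda i: min(np.linalg.norm(patterns[i] - patterns[s])
                                    for s in seeds))
    return seeds + rest
```

For two well-separated clusters, the returned order starts with one pattern from each cluster, followed by the remaining patterns closest to those seeds.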

In this paper, a method that uses a genetic algorithm (GA) [7] to select the presentation order of training patterns is proposed. The method applies the selection, mutation and inversion operators of the GA to find a presentation order of training patterns that maximises the SFAM classification performance. Once the order is selected, only a single SFAM training simulation (as with min–max ordering) is required for classification of the test patterns. The performance of the proposed technique is compared with that of training patterns ordered by min–max and random ordering on the classification of visual evoked potential (VEP) patterns to identify individuals [8]. In addition, three datasets from the UCI repository [9] are used to measure the performance of these ordering methods. From this point onwards, the three methods are denoted simply as the GA, random ordering and min–max methods. All discussions of SFAM in this paper apply equally to FAM, provided that FAM is used as a classifier.
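A GA search over permutations of this kind can be sketched as below. This is a minimal illustration, not the paper's algorithm: the truncation selection scheme, population size and the stand-in `fitness` callback (which in the paper would be the SFAM classification accuracy obtained with that presentation order) are assumptions. The swap mutation and sub-sequence inversion keep every chromosome a valid permutation.

```python
import random

def ga_order(n_patterns, fitness, generations=50, pop_size=20, seed=0):
    """Sketch of a GA over presentation orders (permutations).
    fitness(order) -> float; higher is better."""
    rng = random.Random(seed)
    # Initial population: random permutations of the pattern indices.
    pop = [rng.sample(range(n_patterns), n_patterns) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            # Inversion: reverse a random sub-sequence of the order.
            i, j = sorted(rng.sample(range(n_patterns), 2))
            child[i:j + 1] = reversed(child[i:j + 1])
            # Mutation: swap two positions (still a valid permutation).
            a, b = rng.sample(range(n_patterns), 2)
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children               # elitist replacement
    return max(pop, key=fitness)
```

Because the survivors are carried over unchanged, the best fitness in the population never decreases from one generation to the next.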

Section snippets

Simplified fuzzy ARTMAP

Fig. 1 shows the architecture of SFAM. It consists of two modules (Fuzzy ART and Inter ART) that create stable recognition categories in response to a sequence of input patterns. During supervised learning, Fuzzy ART receives a stream of input features representing the pattern, which maps to an output class in the category layer. The Inter ART module corrects a predictive error in the output category layer by raising the vigilance parameter (VP) of Fuzzy ART by a minimal amount. …
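The category dynamics referred to above can be illustrated with the standard fuzzy ART equations. This is a textbook sketch, not the authors' code: the choice parameter value and the function names are assumptions. Inputs are complement coded, the fuzzy AND is the component-wise minimum, the choice function scores each committed category, and the match value is the quantity tested against the vigilance parameter.

```python
import numpy as np

def complement_code(a):
    """Complement coding used by fuzzy ART: I = (a, 1 - a).
    The coded input always sums to the original dimensionality."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def choice_and_match(input_vec, weight, alpha=0.001):
    """Fuzzy ART choice function T_j and match value for one category.
    The fuzzy AND (^) is the component-wise minimum."""
    fuzzy_and = np.minimum(input_vec, weight)
    choice = fuzzy_and.sum() / (alpha + weight.sum())   # T_j = |I ^ w| / (alpha + |w|)
    match = fuzzy_and.sum() / input_vec.sum()           # |I ^ w| / |I|, tested against VP
    return choice, match
```

When the input exactly matches a category's weight vector, the match value is 1, so the vigilance test is always passed; match tracking raises the vigilance just above the current match value when the predicted class is wrong.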

Pattern ordering using GA method

The overall methodology can be divided into two separate phases. In the first phase, either GA or min–max (for comparison) was used to order the input training patterns. The second phase involved testing the performance of SFAM for all the ordering methods (i.e. the proposed method, min–max and random). This second phase was conducted to show the improvement in SFAM performance when it is trained with patterns ordered by the GA rather than by min–max or random ordering.

Initially, the available …

Experimental study

An experimental study was conducted to show the superior performance of the GA method compared to the random ordering and min–max methods. For this purpose, the dataset used in an earlier work to identify individuals [8] was used. In addition, three datasets from the UCI repository [9], namely wine, glass and iris, were also used.

Conclusion

This paper has proposed the use of a GA to select the presentation order of training patterns for SFAM. The new method can also be applied to FAM. The performance of the proposed method has been compared with that of random ordering with a voting strategy and of the min–max method on an individual classification problem using VEP signals and on three datasets from the UCI repository. Though the proposed method incurs a computational overhead, it arises only during the …

Acknowledgment

We thank Prof. Henri Begleiter at the Neurodynamics Laboratory at the State University of New York Health Centre at Brooklyn, USA who generated the raw VEP data and Mr. Paul Conlon, of Sasco Hill Research, USA for sending us the database. A part of this work was funded by University of Essex Research Promotion Fund (DDQP40).

References (10)

  • P. Raveendran et al., Fuzzy ARTMAP classification of invariant features derived using angle of rotation from a neural network, Inf. Sci.: Int. J. (2000)
  • G.A. Carpenter et al., Fuzzy ARTMAP: a neural network architecture for incremental supervised learning of analog multidimensional maps, IEEE Trans. Neural Netw. (1992)
  • T. Kasuba, Simplified fuzzy ARTMAP, AI Expert (1993)
  • I. Dagher et al., An ordering algorithm for pattern presentation in Fuzzy ARTMAP that tends to improve generalization performance, IEEE Trans. Neural Netw. (1999)
  • R. Palaniappan et al., A new brain-computer interface design using fuzzy ARTMAP, IEEE Trans. Neural Syst. Rehabil. Eng. (2002)
