Abstract:
Single-layer feedforward networks (SLFNs) have been proven to be universal approximators when all of their parameters are allowed to be adjustable, and they are widely used in classification and regression problems. SLFN learning involves two tasks: determining the network size and training the parameters. Most existing algorithms do not handle both tasks satisfactorily. Constructive algorithms tune only part of the parameters and therefore may fail to produce a compact network. Gradient-based optimization algorithms tune all the parameters but require the user to preset the network size, so the optimal size must be found by trial and error; because the results of one trial cannot be reused in another, this approach is computationally expensive. In this paper, a hybrid constructive (HC) algorithm is proposed for SLFN learning that trains all the parameters and determines the network size simultaneously. First, a hybrid algorithm combining the Levenberg-Marquardt algorithm and the least-squares method is presented for training an SLFN of fixed size. Then, based on this hybrid algorithm, an incremental constructive scheme is proposed: a new, randomly initialized neuron is added whenever training becomes trapped in a local minimum. Because training continues from the previous results after each neuron is added, the proposed HC algorithm is efficient. Several practical problems are used to compare it with other popular algorithms. The experimental results demonstrate that the HC algorithm is more efficient than optimization methods relying on trial and error, and that it achieves much more compact SLFNs than constructive algorithms.
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 26, Issue: 8, August 2015)
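
The abstract only outlines the method, so the following is a minimal Python sketch of the constructive loop it describes, not the authors' implementation. It adopts one plausible reading of "combining the Levenberg-Marquardt algorithm and the least-square method": Levenberg-Marquardt optimizes the hidden-layer parameters while the output weights are solved by linear least squares at every evaluation, and a randomly initialized neuron is appended whenever the fixed-size training stalls above tolerance. All names (hc_train, tol, max_neurons) and the sigmoid activation are illustrative assumptions.

    # Sketch of the HC-style loop under the assumptions stated above.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.special import expit

    def hidden_output(X, w):
        """Sigmoid hidden-layer output matrix for flattened hidden params w."""
        n_in = X.shape[1]
        W = w.reshape(-1, n_in + 1)          # each row: [input weights | bias]
        return expit(X @ W[:, :-1].T + W[:, -1])

    def residuals(w, X, t):
        """Training residuals with output weights solved by linear least squares."""
        H = hidden_output(X, w)
        beta, *_ = np.linalg.lstsq(H, t, rcond=None)
        return H @ beta - t

    def hc_train(X, t, tol=1e-4, max_neurons=20, seed=0):
        rng = np.random.default_rng(seed)
        n_in = X.shape[1]
        w = rng.standard_normal(n_in + 1)    # start with one hidden neuron
        for _ in range(max_neurons):
            # Levenberg-Marquardt on the hidden-layer parameters only.
            sol = least_squares(residuals, w, args=(X, t), method="lm")
            w = sol.x
            if np.sqrt(np.mean(sol.fun ** 2)) < tol:
                break
            # Training stalled above tolerance: add one randomly initialized
            # neuron and continue from the previous solution.
            w = np.concatenate([w, rng.standard_normal(n_in + 1)])
        H = hidden_output(X, w)
        beta, *_ = np.linalg.lstsq(H, t, rcond=None)
        return w, beta

    # Toy usage: approximate a 1-D sine curve.
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    t = np.sin(X).ravel()
    w, beta = hc_train(X, t)
    print("hidden neurons used:", w.size // (X.shape[1] + 1))

Because each growth step keeps the previously trained weights and only the new neuron is random, later trials reuse earlier work, which is the efficiency argument the abstract makes against restarting a fixed-size optimizer at every candidate network size.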