An efficient self-organizing RBF neural network for water quality prediction
Introduction
Predicting water quality in the wastewater treatment process can provide a basis for treatment-plant management decisions that minimize microbial risks and optimize treatment operation (Ge & Frick, 2009). In many practical situations, however, it is difficult to predict water quality accurately, owing to a lack of knowledge of the process parameters or to disturbances in the system. A predictor should therefore counteract the disturbances to which the system is subjected and be able to adjust itself to the system's changing dynamics. Radial basis function (RBF) neural networks have been applied successfully to dynamic system problems because they can predict behavior directly from input/output data (Ferrari et al., 2010, Lee and Ko, 2009, Wang and Yu, 2008). However, the number of hidden neurons in these RBF networks is often assumed to be constant. If the number of hidden neurons is too large, the computational load is heavy and, in general, the performance is poor; if it is too small, the learning capacity may be insufficient to achieve the desired performance. For this reason, it is crucial to optimize the structure of RBF networks. A brief review of the existing algorithms is given below.
The resource-allocating network (RAN), proposed by Platt (1991), was the first RBF neural network model with a dynamic structure. During training, new Gaussian neurons can be inserted into the hidden layer. Although this growing strategy usually yields a smaller network than a fixed-size neural network, the RAN may still become extremely large because insignificant hidden neurons are never pruned. Yingwei et al. introduced a strategy to prune hidden neurons whose contribution is relatively small, and incorporated this pruning strategy into the RAN together with an extended Kalman filter (EKF) (Li, Sundararajan, & Saratchandran, 1997). The resulting network is referred to as the minimal resource-allocating network (MRAN); its applications are described in Li, Sundararajan, and Saratchandran (1998). MRAN is a popular tool for designing optimal RBF structures (Panchapakesan, Palaniswami, Ralph, & Manzie, 2002). However, the initial parameters of newly added hidden neurons were not considered, and the convergence of MRAN was not discussed (Lian, Lee, Sudhoff, & Stanislaw, 2008).
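RAN's growth decision rests on two novelty criteria: the input must be far from every existing center, and the prediction error must be large. A minimal sketch of that test (function and threshold names are illustrative, not from the cited papers):

```python
import numpy as np

def ran_should_grow(x, error, centers, dist_thresh, err_thresh):
    """RAN-style novelty test: a new Gaussian hidden neuron is allocated
    only if the input lies far from every existing center AND the current
    prediction error is large (Platt's two growth criteria)."""
    d_min = min(np.linalg.norm(x - c) for c in centers) if centers else np.inf
    return d_min > dist_thresh and abs(error) > err_thresh
```

MRAN adds the converse step: neurons whose normalized contribution stays below a threshold for several consecutive samples are pruned, which is what keeps the network from growing without bound.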
The self-organizing design of RBF neural networks has been discussed by several authors. Huang, Saratchandran, and Sundararajan (2004) proposed a simple sequential learning algorithm for RBF neural networks, referred to as the RBF growing and pruning algorithm (GAP-RBF). The original GAP-RBF was later enhanced into a more advanced model known as GGAP-RBF (Huang, Saratchandran, & Sundararajan, 2005). Both GAP-RBF and GGAP-RBF use a growing and pruning strategy based on the "significance" of a neuron, which is linked to the learning accuracy. The resulting network structure is simple, and the computational time is less than that of a conventional RBF network. In practice, however, GAP-RBF and GGAP-RBF require the complete set of training samples before training begins, and designers and technical domain experts generally cannot have such a priori knowledge prior to implementation. More recently, methods based on genetic algorithms (GAs) have been used to change the number of hidden neurons (Gonzalez et al., 2003, Wu and Chow, 2007). The great theoretical advantage of a GA is its ability to search globally, but this comes at the cost of increased computation. Feng (2006) proposed a self-organizing RBF neural network model based on particle swarm optimization (PSO) that attempts to address the training-time issue: PSO is used to construct the structure of the RBF network in order to simplify and speed up optimization. Because PSO is a population-based evolutionary computation technique, however, the training time remains too long.
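The "significance" idea behind GAP-RBF can be pictured as the average contribution a Gaussian neuron makes to the network output over the input domain. The sketch below is an illustrative approximation of that notion, not the exact formula from Huang et al.; all names are hypothetical:

```python
import numpy as np

def neuron_significance(weight, width, n_inputs, input_range):
    """Rough GAP-RBF-style 'significance': average contribution of a
    Gaussian neuron to the network output over a uniform input range.
    Neurons whose significance falls below the desired learning accuracy
    become pruning candidates. (Illustrative approximation only.)"""
    # The integral of a Gaussian basis function over the input domain
    # scales with (sqrt(pi) * width)^n_inputs; divide by the domain volume.
    vol = np.prod(input_range[:, 1] - input_range[:, 0])
    return abs(weight) * (np.sqrt(np.pi) * width) ** n_inputs / vol
```

Linking growth and pruning to one significance measure is what lets GAP-RBF keep the network compact, but evaluating it requires knowing the input distribution, hence the need for the full training set noted above.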
In this paper, a new algorithm, called the flexible structure RBF neural network (FS-RBFNN), is presented. The algorithm has several advantages. Firstly, a neuron's average firing rate is used to determine whether new neurons should be inserted. The firing rate used in the FS-RBFNN is analogous to the spiking frequency of a presynaptic neuron in a biological neural system (Neves, Cooke, & Bliss, 2008). When the firing rate of a hidden neuron exceeds a given threshold, new neurons are inserted into the hidden layer.
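The firing-rate criterion above can be sketched as follows; the windowed mean of Gaussian activations and the function names are an assumption for illustration, since the paper's exact definition is not given in this excerpt:

```python
import numpy as np

def average_firing_rate(activations):
    """Average firing rate of a hidden neuron over a window of recent
    training samples: the mean Gaussian activation, by analogy with
    presynaptic spiking frequency. (Sketch; the paper's exact
    definition may differ.)"""
    return float(np.mean(activations))

def should_insert_neuron(activations, rate_threshold):
    # A persistently strong response suggests the neuron is overloaded,
    # so its region of the input space is a candidate for splitting.
    return average_firing_rate(activations) > rate_threshold
```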
Secondly, the connectivity of hidden neurons is estimated with an information-theoretic method: during training, the connectivity between hidden neurons is measured by their mutual information (MI) (Krivov, Ulanowicz, & Dahiya, 2003).
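A common way to measure MI between two hidden neurons' output sequences is a histogram estimator, sketched below. This is a generic estimator offered for illustration; the paper's exact MI computation may differ:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of the mutual information I(X;Y) in nats
    between the output sequences of two hidden neurons. High MI means
    the two neurons encode largely redundant information, making one of
    the pair a candidate for merging or pruning."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginal of X
    py = pxy.sum(axis=0, keepdims=True)        # marginal of Y
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))
```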
Thirdly, the convergence of the FS-RBFNN is analyzed both theoretically and experimentally. Most existing self-organizing methods for RBF structure design (Alexandridis et al., 2003, Bortman and Aladjem, 2009, Shi et al., 2005) analyze the convergence of the learning process only experimentally. For the FS-RBFNN to work in practice, it is essential that its training process be convergent, and the network has been designed specifically with this in mind.
Fourthly, the FS-RBFNN focuses on reducing the number of retraining epochs after the structure has been modified by growing and pruning. It is well known that when a design algorithm adds hidden neurons to (or prunes them from) an existing RBF structure, the modified structure must be retrained to adapt its connection weights. The error required (ER) method is used to determine the initial values of neurons inserted into the FS-RBFNN.
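The intuition of ER-style initialization is that a new neuron should absorb the current residual error at the point where it is inserted, so the modified network needs little retraining. A minimal sketch of that idea (the details and names here are assumptions, not the paper's exact equations):

```python
import numpy as np

def init_new_neuron(x, residual_error, centers, overlap=1.0):
    """Error-required (ER) style initialization for a newly inserted
    neuron: center it on the current input, tie its width to the nearest
    existing center, and set its output weight to the current residual so
    the modified network reproduces the target at x immediately.
    (Sketch of the idea only.)"""
    center = np.asarray(x, dtype=float)
    if centers:
        width = overlap * min(np.linalg.norm(center - np.asarray(c))
                              for c in centers)
    else:
        width = 1.0
    weight = residual_error  # the new neuron's activation at x is exp(0) = 1
    return center, width, weight
```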
The outline of this paper is as follows. Section 2 describes how the MI and the average firing rate are used to design the RBF; it also introduces the FS-RBFNN. Section 3 discusses and analyzes the algorithm. Section 4 presents experimental results which compare the performance of the algorithm with other similar algorithms. Section 5 concludes the paper.
Section snippets
Flexible structure radial basis function neural network (FS-RBFNN)
An RBF neural network has a simple structure in terms of the direction of information flow. Since the performance of an RBF neural network depends heavily on its architecture, research has focused on self-organizing methods for designing the architecture of three-layered RBF neural networks.
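For reference, the forward pass of the three-layered architecture being self-organized can be written compactly; this is the generic Gaussian RBF evaluation, not a method specific to this paper:

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a three-layered Gaussian RBF network: each hidden
    neuron computes exp(-||x - c||^2 / sigma^2), and the output neuron
    forms their weighted sum."""
    x = np.asarray(x, dtype=float)
    phi = np.array([np.exp(-np.linalg.norm(x - c) ** 2 / (s ** 2))
                    for c, s in zip(centers, widths)])
    return float(phi @ np.asarray(weights))
```

Growing and pruning, in this view, simply lengthen or shorten the `centers`, `widths`, and `weights` lists during training.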
In order to design the structure of the RBF neural network automatically, a dynamic tuning strategy is used in the FS-RBFNN. This strategy changes the topology of the RBF
Convergence discussion
For the proposed FS-RBFNN, the convergence of the algorithm with respect to topology adjustment is an important issue that needs careful investigation; it is crucial for successful applications. First, the convergence of the case without structure changes is established. Second, convergence during the structure-changing phase is investigated, where the convergence of the constructive training algorithm is considered. Furthermore, through this analysis one obtains a better
Experimental studies
The performance of the FS-RBFNN was verified by applying it to a nonlinear dynamic system: predicting water quality in the wastewater treatment process. The algorithm was evaluated by comparing its results with those of other similar self-organizing RBFNNs. All simulations were programmed in Matlab version 7.01 and run on a Pentium 4 with a clock speed of 2.6 GHz and 1 GB of RAM, under Microsoft Windows XP.
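When comparing self-organizing RBF variants, prediction accuracy is typically summarized by the root-mean-square error; a small helper for that metric (offered as a generic sketch, since the excerpt does not state the paper's exact error measure):

```python
import numpy as np

def rmse(predictions, targets):
    """Root-mean-square error between predicted and measured water
    quality values: the usual yardstick for comparing RBF variants."""
    p = np.asarray(predictions, dtype=float)
    t = np.asarray(targets, dtype=float)
    return float(np.sqrt(np.mean((p - t) ** 2)))
```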
In the following simulations the learning parameters
Conclusion
In this paper, a new algorithm is proposed for creating a self-organizing RBFNN whose architecture is automatically adapted based on neuron activity and mutual information (MI). The advantages of the proposed approach are that it simplifies and accelerates the structure optimization of the RBFNN, and that it can solve the practical problem of predicting water quality in the wastewater treatment process. The effectiveness and performance of the method are first demonstrated by using an example
Acknowledgments
The authors would like to thank Dr. R. Dale-Jones for reading the manuscript and providing valuable comments. The authors would also like to thank the anonymous reviewers for their valuable comments and suggestions, which helped improve this paper greatly. This work was supported by the National 863 Scheme Foundation of China under Grants 2009AA04Z155 and 2007AA04Z160, National Science Foundation of China under Grants 61034008 and 60873043, Ph.D. Program Foundation from Ministry of Chinese
References (27)
- et al. (2003). A new algorithm for online structure and parameter adaptation of RBF networks. Neural Networks.
- et al. (2007). Backfilling missing microbial concentrations in a riverine database using artificial neural networks. Water Research.
- (2006). Self-generation RBFNs using evolutional PSO learning. Neurocomputing.
- et al. (2003). Quantitative measures of organization for multiagent systems. Biosystems.
- et al. (2009). Time series prediction using RBF neural networks with a nonlinear time-varying evolution PSO algorithm. Neurocomputing.
- et al. (2005). Identification and prediction using recurrent compensatory neuro-fuzzy systems. Fuzzy Sets and Systems.
- et al. (2008). Water quality modeling for load reduction under uncertainty: a Bayesian approach. Water Research.
- et al. (2005). Sensitivity analysis applied to the construction of radial basis function networks. Neural Networks.
- et al. (2008). Adaptive RBF network for parameter estimation and stable air–fuel ratio control. Neural Networks.
- Asuncion, A., & Newman, D.J. (2007). UCI Machine Learning Repository, [Online]. Available...
- A growing and pruning method for radial basis function networks. IEEE Transactions on Neural Networks.
- A model-based approach to predicting BOD5 in settled sewage. Water Science and Technology.
- A hierarchical RBF online learning algorithm for real-time 3-D scanner. IEEE Transactions on Neural Networks.