Abstract
This paper presents a metahybrid algorithm that combines Wolf Search (WS) with the Elman Recurrent Neural Network (ERNN). The ERNN is an efficient learning architecture that extends the feed-forward network with context units feeding the hidden state back as input. Because the ERNN is trained with gradient descent, it remains prone to local minima and slow convergence. This paper applies a recent metaheuristic, Wolf Search (WS), inspired by the predatory behavior of wolves, to optimize the ERNN weights, achieving faster convergence and avoiding local minima. The performance of the proposed Metahybrid Wolf Search Elman Recurrent Neural Network (WRNN) is compared with the Bat with Back-Propagation (Bat-BP) algorithm and other hybrid variants on benchmark classification datasets. The simulation results show that the proposed Metahybrid WRNN outperforms the other algorithms in terms of CPU time, accuracy, and MSE.
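The core idea can be sketched in a few lines of code. The snippet below is a minimal, illustrative Python sketch, not the authors' WRNN implementation: it flattens the weights of a small Elman network into a single vector and searches that vector with a simple Wolf Search-style loop (prey-seeking toward better neighbours within a visual range, random local moves otherwise, and occasional escape jumps). The network sizes, toy data, and all hyper-parameters are assumptions made for illustration; in the paper the fitness measure is the MSE on benchmark classification datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- tiny Elman network: input -> hidden (with context feedback) -> output ---
N_IN, N_HID, N_OUT = 2, 4, 1
N_W = N_HID * (N_IN + N_HID + 1) + N_OUT * (N_HID + 1)  # total number of weights

def forward(weights, X):
    """Run the Elman network over a sequence X (T x N_IN); return the outputs."""
    i = 0
    W_ih = weights[i:i + N_HID * N_IN].reshape(N_HID, N_IN); i += N_HID * N_IN
    W_hh = weights[i:i + N_HID * N_HID].reshape(N_HID, N_HID); i += N_HID * N_HID
    b_h  = weights[i:i + N_HID]; i += N_HID
    W_ho = weights[i:i + N_OUT * N_HID].reshape(N_OUT, N_HID); i += N_OUT * N_HID
    b_o  = weights[i:i + N_OUT]
    h = np.zeros(N_HID)                      # context units (previous hidden state)
    outs = []
    for x in X:
        h = np.tanh(W_ih @ x + W_hh @ h + b_h)
        outs.append(W_ho @ h + b_o)
    return np.array(outs)

def mse(weights, X, y):
    return float(np.mean((forward(weights, X).ravel() - y) ** 2))

# toy data (illustrative): predict the sum of the two inputs at each step
X = rng.uniform(-1, 1, size=(30, N_IN))
y = X.sum(axis=1)

# --- Wolf Search-style optimization over the weight vector ---
n_wolves, visual, step, p_escape, iters = 15, 1.0, 0.2, 0.25, 200
wolves = rng.uniform(-1, 1, size=(n_wolves, N_W))
fitness = np.array([mse(w, X, y) for w in wolves])

for _ in range(iters):
    for i in range(n_wolves):
        # prey-seeking: move toward a better wolf within visual range
        dists = np.linalg.norm(wolves - wolves[i], axis=1)
        neighbours = np.where((dists < visual) & (fitness < fitness[i]))[0]
        if neighbours.size > 0:
            best = neighbours[np.argmin(fitness[neighbours])]
            candidate = (wolves[i] + step * (wolves[best] - wolves[i])
                         + 0.1 * step * rng.standard_normal(N_W))
        else:
            # no better neighbour visible: random (Brownian-like) local move
            candidate = wolves[i] + step * rng.standard_normal(N_W)
        # occasional escape jump, which helps the search leave local minima
        if rng.random() < p_escape:
            candidate = candidate + visual * rng.standard_normal(N_W)
        f_new = mse(candidate, X, y)
        if f_new < fitness[i]:               # greedy acceptance
            wolves[i], fitness[i] = candidate, f_new

print("best MSE:", fitness.min())
```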
Acknowledgments
The authors would like to thank the Office of Research, Innovation, Commercialization and Consultancy (ORICC), Universiti Tun Hussein Onn Malaysia (UTHM), and the Ministry of Education (MOE) Malaysia for financially supporting this research under the Fundamental Research Grant Scheme (FRGS), vote no. 1236. This research is also supported by Gates IT Solution Sdn. Bhd. under its publication scheme.
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Nawi, N.M., Rehman, M.Z., Hamid, N.A., Khan, A., Naseem, R., Uddin, J. (2017). Optimizing Weights in Elman Recurrent Neural Networks with Wolf Search Algorithm. In: Herawan, T., Ghazali, R., Nawi, N.M., Deris, M.M. (eds) Recent Advances on Soft Computing and Data Mining. SCDM 2016. Advances in Intelligent Systems and Computing, vol 549. Springer, Cham. https://doi.org/10.1007/978-3-319-51281-5_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-51279-2
Online ISBN: 978-3-319-51281-5