Abstract
Symbolic regression is commonly performed with evolutionary algorithms such as genetic programming (GP). This work proposes a new symbolic regression approach, based on artificial neural networks, for constructing symbolic models from examples. The approach comprises a long-term artificial neural network memory (LTANN-MEM), a working memory (WM), and a proposed neural symbolization algorithm (NSA) that uses both memories to synthesize symbolic models equivalent to the learning examples. LTANN-MEM consists of two separate multilayer perceptron (MLP) feed-forward neural networks, while the working memory consists of a single MLP. The core idea is to memorize the learning experience of individual perceptrons in long-term memory (LTM) so that this experience can be reused to generate and develop hypotheses about the learning examples. Although the idea is generic and applicable to symbolization in general, it is applied here to symbolic regression in the Boolean domain only. The results show that the proposed approach can search the solution space using learning experience previously stored in LTM to guide the search. A comparison with GP, based on the number of candidate solutions that emerge in each approach, shows that the proposed NSA outperforms GP as the number of inputs and outputs of a problem increases.
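The core idea of storing the learning experience of individual perceptrons so it can be reused on later problems can be illustrated with a minimal sketch. This is not the paper's LTANN-MEM/NSA implementation, whose details the abstract does not specify; the names (`ltm`, `train_perceptron`) and the simple weight-matching reuse step are illustrative assumptions only.

```python
# Minimal sketch (assumptions, not the paper's implementation): train single
# perceptrons on Boolean subfunctions, store their weights in a "long-term
# memory" dict, then reuse stored perceptrons to explain new examples.
import itertools

def train_perceptron(samples, epochs=20, lr=0.5):
    """Train one step-activation perceptron on Boolean (input, target) pairs."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y  # classic perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Long-term memory: each trained perceptron is stored under a symbolic label.
ltm = {}
inputs = list(itertools.product([0, 1], repeat=2))
for name, fn in [("AND", lambda a, b: a & b), ("OR", lambda a, b: a | b)]:
    samples = [(x, fn(*x)) for x in inputs]
    ltm[name] = train_perceptron(samples)

# Reuse: match new learning examples against stored experience instead of
# training from scratch, yielding a symbolic hypothesis ("OR") directly.
new_examples = [(x, x[0] | x[1]) for x in inputs]
matches = [name for name, (w, b) in ltm.items()
           if all(predict(w, b, x) == t for x, t in new_examples)]
print(matches)  # → ['OR']
```

Because the stored units carry symbolic labels, a match immediately names a candidate building block for the hypothesis, which is the kind of experience-guided search the abstract describes.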
Deklel, A.K., Hamdy, A.M. & Saad, E.M. Multi-objective symbolic regression using long-term artificial neural network memory (LTANN-MEM) and neural symbolization algorithm (NSA). Neural Comput & Applic 29, 935–942 (2018). https://doi.org/10.1007/s00521-016-2500-8