
Multi-objective symbolic regression using long-term artificial neural network memory (LTANN-MEM) and neural symbolization algorithm (NSA)

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Symbolic regression is commonly performed with evolutionary algorithms such as genetic programming (GP). This work aims to construct symbolic models from examples and proposes a new symbolic regression approach based on artificial neural networks. The approach consists of a long-term artificial neural network memory (LTANN-MEM), a working memory (WM), and a neural symbolization algorithm (NSA) that uses LTANN-MEM and WM to synthesize symbolic models equivalent to the learning examples. LTANN-MEM is composed of two separate multilayer perceptron (MLP) feed-forward neural networks, while the working memory is a single MLP. The core idea is to memorize the learning experience of individual perceptrons in long-term memory (LTM) so that it becomes available for reuse when generating and developing hypotheses about the learning examples. Although this idea is generic and could serve symbolization in general, it is applied here to symbolic regression in the Boolean domain only. The results show that the proposed approach can search the solution space guided by learning experience previously stored in LTM. A comparison with GP, based on the number of candidate solutions that emerge in each approach, shows that the proposed NSA outperforms GP as the number of inputs and outputs of a problem increases.
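
The abstract gives only a high-level description of LTANN-MEM and NSA, so the short Python sketch below is purely illustrative and is not the authors' implementation. Under simplifying assumptions, it shows the core idea of memorizing the learning experience of individual perceptrons and retrieving it as ready-made hypotheses for new Boolean learning examples. Every name in it (train_perceptron, PerceptronMemory, memorise, matching_units) is hypothetical, and the single-perceptron store stands in for the paper's two-MLP long-term memory and MLP working memory.

# Illustrative sketch only: a toy long-term memory of trained perceptrons
# that is reused when new Boolean learning examples arrive. Not the paper's
# LTANN-MEM/NSA implementation; all names here are hypothetical.
from itertools import product

def train_perceptron(samples, epochs=50, lr=0.1):
    """Train one perceptron on (inputs, target) pairs over {0, 1}; return (weights, bias)."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

class PerceptronMemory:
    """Long-term store of trained perceptrons, each remembered with a label for the
    Boolean sub-function it realises, so it can be reused on later learning tasks."""

    def __init__(self):
        self.entries = []  # list of (label, weights, bias)

    def memorise(self, label, samples):
        w, b = train_perceptron(samples)
        # Keep only perceptrons that reproduce their training examples exactly.
        if all(predict(w, b, x) == t for x, t in samples):
            self.entries.append((label, w, b))

    def matching_units(self, samples):
        """Labels of stored perceptrons that already explain the target examples,
        i.e. ready-made hypotheses retrieved from long-term memory."""
        return [lbl for lbl, w, b in self.entries
                if all(predict(w, b, x) == t for x, t in samples)]

if __name__ == "__main__":
    inputs = list(product([0, 1], repeat=2))
    ltm = PerceptronMemory()
    ltm.memorise("AND", [(x, x[0] & x[1]) for x in inputs])
    ltm.memorise("OR",  [(x, x[0] | x[1]) for x in inputs])
    # A new Boolean learning task: reuse previously memorised units as hypotheses.
    target = [(x, x[0] | x[1]) for x in inputs]
    print(ltm.matching_units(target))  # expected: ['OR']

In this toy setting a unit memorized earlier is simply retrieved when its truth table reappears; the NSA described in the paper uses the stored learning experience more broadly, to generate and develop hypotheses rather than only to look up exact matches.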



Author information

Corresponding author

Correspondence to A. K. Deklel.


About this article

Cite this article

Deklel, A.K., Hamdy, A.M. & Saad, E.M. Multi-objective symbolic regression using long-term artificial neural network memory (LTANN-MEM) and neural symbolization algorithm (NSA). Neural Comput & Applic 29, 935–942 (2018). https://doi.org/10.1007/s00521-016-2500-8

  • DOI: https://doi.org/10.1007/s00521-016-2500-8
