
Efficiently inaccurate approximation of hyperbolic tangent used as transfer function in artificial neural networks

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

We propose approximating \(\tanh\) (the hyperbolic tangent) by a specially constructed cubic spline. This saves many of the multiplications, and the division, required by a standard double-precision evaluation of the function. The price is that the final approximation retains only 2–4 correct decimal digits. As a result, a speed-up in neural network performance is observed after implementing this new approximant as the transfer function.
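The idea in the abstract can be illustrated with a minimal Python sketch. This is not the authors' exact spline construction: the saturation range A, the knot count N, and the use of a piecewise cubic Hermite interpolant are our own assumptions, chosen only to show how a few-multiply-add spline can reach roughly 3–4 correct decimal digits while avoiding the exponentials and division of the standard formula \(\tanh x = (e^{2x}-1)/(e^{2x}+1)\).

```python
import math

# Hypothetical sketch (not the paper's exact coefficients): a piecewise
# cubic Hermite interpolant of tanh on [0, A], using odd symmetry for
# x < 0 and clamping to +/-1 beyond A.  All coefficients are
# precomputed once, so each evaluation costs only a handful of
# multiply-adds -- no exp() and no division in the hot path.

A, N = 4.0, 16                # saturate outside [0, A]; N sub-intervals
H = A / N                     # knot spacing
INV_H = N / A                 # precomputed reciprocal (avoids division)
# At each knot store (tanh value, derivative scaled by H); the
# derivative of tanh is 1 - tanh^2.
KNOTS = [(math.tanh(i * H), (1.0 - math.tanh(i * H) ** 2) * H)
         for i in range(N + 1)]

def tanh_spline(x: float) -> float:
    """Approximate tanh(x) to roughly 3 decimal digits."""
    s, ax = (1.0, x) if x >= 0.0 else (-1.0, -x)   # odd symmetry
    if ax >= A:
        return s                    # |tanh(x)| is ~1 here (err < 7e-4)
    i = int(ax * INV_H)             # locate the sub-interval
    t = ax * INV_H - i              # local coordinate in [0, 1)
    f0, d0 = KNOTS[i]
    f1, d1 = KNOTS[i + 1]
    # Cubic Hermite basis functions in t.
    t2 = t * t
    t3 = t2 * t
    y = (f0 * (2.0 * t3 - 3.0 * t2 + 1.0) + d0 * (t3 - 2.0 * t2 + t)
         + f1 * (3.0 * t2 - 2.0 * t3) + d1 * (t3 - t2))
    return s * y
```

Under these assumptions the largest error comes from the clamp at x = A (about 1 − tanh(4) ≈ 6.7e-4); the interior interpolation error is an order of magnitude smaller, consistent with the 2–4 decimal digits the abstract accepts.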



Author information


Corresponding author

Correspondence to T. E. Simos.

Ethics declarations

Conflicts of interest

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Simos, T.E., Tsitouras, C. Efficiently inaccurate approximation of hyperbolic tangent used as transfer function in artificial neural networks. Neural Comput & Applic 33, 10227–10233 (2021). https://doi.org/10.1007/s00521-021-05787-0

