Abstract
We propose approximating \(\tanh\) (the hyperbolic tangent) by a specially constructed cubic spline. This saves many of the multiplications, as well as the division, required by a standard double-precision evaluation of the function. The price we pay is that the final approximation retains at most 2–4 correct decimal digits. As a result, neural networks run measurably faster when this approximant replaces \(\tanh\) as the transfer function.
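To illustrate the idea (not the paper's exact spline), the following is a minimal sketch of a piecewise cubic approximation of \(\tanh\): a Hermite cubic spline on \([0, 4]\), extended to negative arguments by odd symmetry and saturated to \(\pm 1\) beyond the last knot. The knot spacing `H = 0.5` and cutoff `XMAX = 4.0` are illustrative assumptions; each evaluation then costs one table lookup and a Horner scheme with three multiplications, with no division.

```python
import math

# Sketch of a piecewise-cubic tanh approximant (illustrative, not the
# paper's construction). Knot spacing H and cutoff XMAX are assumptions.
H = 0.5                      # knot spacing on [0, XMAX]
XMAX = 4.0                   # tanh(4) ~ 0.99933, so we saturate to 1 beyond it
N = int(XMAX / H)

# Precompute per-interval Hermite coefficients of p(t) = a + b t + c t^2 + d t^3,
# t in [0, H], matching tanh and its derivative at both knots.
_COEFFS = []
for i in range(N):
    x0 = i * H
    f0, f1 = math.tanh(x0), math.tanh(x0 + H)
    d0 = 1.0 - f0 * f0       # tanh'(x) = 1 - tanh(x)^2
    d1 = 1.0 - f1 * f1
    a, b = f0, d0
    c = (3.0 * (f1 - f0) - (2.0 * d0 + d1) * H) / (H * H)
    d = (2.0 * (f0 - f1) + (d0 + d1) * H) / (H ** 3)
    _COEFFS.append((a, b, c, d))

def tanh_spline(x):
    """Approximate tanh(x) to roughly 3 decimal digits."""
    s = 1.0 if x >= 0.0 else -1.0
    x = abs(x)
    if x >= XMAX:
        return s             # saturation region: |tanh(x)| is essentially 1
    i = int(x / H)           # locate the spline interval
    a, b, c, d = _COEFFS[i]
    t = x - i * H
    # Horner evaluation: three multiplications, no division
    return s * (a + t * (b + t * (c + t * d)))
```

With this spacing, the classical Hermite error bound \(h^4 \max|f^{(4)}|/384\) keeps the absolute error around \(10^{-3}\), in line with the 2–4 digits of accuracy the abstract admits.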
Ethics declarations
Conflicts of interest
The authors have no competing interests.
Cite this article
Simos, T.E., Tsitouras, C. Efficiently inaccurate approximation of hyperbolic tangent used as transfer function in artificial neural networks. Neural Comput & Applic 33, 10227–10233 (2021). https://doi.org/10.1007/s00521-021-05787-0