
Deep delay rectified neural networks

The Journal of Supercomputing

Abstract

An activation function is one of the key factors in the success of deep learning. According to neurobiology research, biological neurons do not respond to an external stimulus at first; they begin to respond only once the stimulus intensity reaches a certain threshold. However, the rectified linear unit (ReLU) family of activation functions, such as ReLU, LReLU, PReLU, ELU, SReLU, and MPELU, does not capture this response characteristic of biological neurons. To address this problem, a delay rectified linear unit (DRLU) activation function with an excitation response threshold is proposed, based on the ReLU activation function. The DRLU activation function is more consistent with the response characteristics of biological neurons and more flexible than ReLU. Experimental results show that DRLU outperforms ReLU in accuracy, training time, and convergence on datasets such as MNIST, Fashion-MNIST, SVHN, CALTECH101, and FLOWER102. DRLU also offers a reference point for introducing an excitation response threshold into LReLU, PReLU, ELU, SReLU, and MPELU.
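
The preview does not include the paper's formal definition of DRLU, so the following is only a minimal sketch of the behavior the abstract describes: a ReLU-like unit that stays silent until its input reaches an excitation response threshold. The function name drlu, the threshold parameter a, and its default value are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def relu(x):
    # Standard rectified linear unit: any positive input produces a response.
    return np.maximum(0.0, x)

def drlu(x, a=0.5):
    # Sketch of a delay rectified linear unit: the unit stays silent until
    # the input reaches the excitation response threshold a (a >= 0) and
    # then responds with the input intensity itself. With a = 0 this
    # reduces to the ordinary ReLU.
    return np.where(x >= a, x, 0.0)

# Inputs below the threshold produce no response, unlike ReLU,
# which responds to any positive input.
x = np.array([-1.0, 0.2, 0.5, 2.0])
print(relu(x))         # [0.  0.2 0.5 2. ]
print(drlu(x, a=0.5))  # [0.  0.  0.5 2. ]
```

Whether the unit should output x or x − a once the threshold is reached is not specified in the abstract; the sketch assumes the former, so that the function differs from ReLU only on the interval [0, a).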




References

  1. Hinton GE, Salakhutdinov RR (2006) Reducing the dimensionality of data with neural networks. Science 313(5786):504–507. https://doi.org/10.1126/science.1127647


  2. Moon J, Hossain MB, Chon KH (2021) AR and ARMA model order selection for time-series modeling with ImageNet classification. Signal Process 183(10):108026. https://doi.org/10.1016/j.sigpro.2021.108026

  3. Shashidhar R, Patilkulkarni S (2021) Visual speech recognition for small scale dataset using VGG16 convolution neural network. Multimed Tools Appl. https://doi.org/10.1007/s11042-021-11119-0

  4. Patil SP, Jariwala KN (2021) Improving the efficiency of image and video forgery detection using hybrid convolutional neural networks. Int J Uncertain Fuzziness Knowl-Based Syst. https://doi.org/10.1142/S0218488521400067


  5. Lafarge MW, Bekkers EJ, Pluim J (2021) Roto-translation equivariant convolutional networks: Application to histopathology image analysis. Med Image Anal 68:101849. https://doi.org/10.1016/j.media.2020.101849


  6. Wali A, Alamgir Z, Karim S, Fawaz A, Ali MB, Adan M, Mujtaba M (2022) Generative adversarial networks for speech processing: a review. Comput Speech Lang 72:101308. https://doi.org/10.1016/j.csl.2021.101308


  7. Han J, Moraga C (1995) The influence of the sigmoid function parameters on the speed of back-propagation learning. In: International Workshop on Artificial Neural Networks. Springer, Berlin, Heidelberg, pp 195–201

  8. Glorot X, Bengio Y (2010) Understanding the difficulty of training deep feedforward neural networks. In: Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pp 249–256

  9. Chatterjee A, Gupta U, Chinnakotla MK, Srikanth R, Galley M, Agrawal P (2019) Understanding emotions in text using deep learning and big data. Comput Hum Behav 93:309–317. https://doi.org/10.1016/j.chb.2018.12.029


  10. Nair V, Hinton GE (2010) Rectified linear units improve restricted Boltzmann machines. In: Proceedings of the 27th International Conference on Machine Learning (ICML-10), Haifa, Israel, pp 807–814

  11. Krizhevsky A, Sutskever I, Hinton GE (2012) ImageNet classification with deep convolutional neural networks. In: Advances in Neural Information Processing Systems, pp 1097–1105

  12. Maas AL, Hannun AY, Ng AY (2013) Rectifier nonlinearities improve neural network acoustic models. Proc ICML 30(1):3–8

  13. He K, Zhang X, Ren S, Sun J (2015) Delving deep into rectifiers: surpassing human-level performance on ImageNet classification. In: Proceedings of the IEEE International Conference on Computer Vision, pp 1026–1034

  14. Clevert DA, Unterthiner T, Hochreiter S (2015) Fast and accurate deep network learning by exponential linear units (ELUs). ICLR

  15. Li Y, Fan C, Li Y, Wu Q, Ming Y (2018) Improving deep neural network with multiple parametric exponential linear units. Neurocomputing 301:11–24. https://doi.org/10.1016/j.neucom.2018.01.084


  16. Jin X, Xu C, Feng J, Wei Y, Xiong J, Yan S (2016) Deep learning with S-shaped rectified linear activation units. In: AAAI Conference on Artificial Intelligence

  17. Shan CH, Guo XR, Ou J (2019) Deep leaky single-peaked triangle neural networks. Int J Control Autom Syst 17(10):2693–2701. https://doi.org/10.1007/s12555-018-0796-0


  18. Shan CH, Guo XR, Ou J (2019) Residual learning of deep convolutional neural networks for image denoising. J Intell Fuzzy Syst 37(2):2809–2818. https://doi.org/10.3233/JIFS-190017


  19. Nicholls JG, Martin AR, Fuchs PA, Brown DA, Diamond ME, Weisblat DA (2012) From neuron to brain, 5th edn. Sinauer Associates, Sunderland, pp 15–17

  20. Feali MS (2021) Using volatile/non-volatile memristor for emulating the short- and long-term adaptation behavior of the biological neurons. Neurocomputing 465:157–166. https://doi.org/10.1016/j.neucom.2021.08.132

  21. Özbek L, Temuçin A, Cengiz B (2021) A study on the modeling and simulation for the motor unit action potential. Commun Fac Sci Univ Ank Ser A1 Math Stat 70(1):290–299. https://doi.org/10.31801/cfsuasmas.752697

  22. Adrian ED (1946) The physical background of perception. Clarendon Press, Oxford


  23. Rigoard P (2021) Atlas of anatomy of the peripheral nerves: the nerves of the limbs, expert edition. Springer Nature

  24. Gulcehre C, Moczulski M, Denil M, Bengio Y (2016) Noisy activation functions. In: International Conference on Machine Learning, pp 3059–3068

  25. LeCun Y, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86(11):2278–2324. https://doi.org/10.1109/5.726791

  26. Simonyan K, Zisserman A (2014) Very deep convolutional networks for large-scale image recognition. arXiv:1409.1556

  27. Chen Z, Jiang Y, Zhang X (2022) ResNet18DNN: prediction approach of drug-induced liver injury by deep neural network with ResNet18. Brief Bioinform 23(1):bbab503. https://doi.org/10.1093/bib/bbab503

  28. LeCun Y, Cortes C, Burges CJ (2010) MNIST handwritten digit database. AT&T Labs [Online]. http://yann.lecun.com/exdb/mnist

  29. Xiao H, Rasul K, Vollgraf R (2017) Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747

  30. Netzer Y, Wang T, Coates A, Bissacco A, Wu B, Ng AY (2011) Reading digits in natural images with unsupervised feature learning. In: NIPS Workshop on Deep Learning and Unsupervised Feature Learning

  31. Borji A, Cheng MM, Jiang H, Li J (2015) Salient object detection: a benchmark. IEEE Trans Image Process 24(12):5706–5722. https://doi.org/10.1109/TIP.2015.2487833

  32. Nilsback ME, Zisserman A (2008) Automated flower classification over a large number of classes. In: Sixth Indian Conference on Computer Vision, Graphics and Image Processing, ICVGIP 2008, Bhubaneswar, India, 16-19 December 2008. IEEE


Funding

This study was funded by the Anhui Polytechnic University Introduced Talent Research Startup Fund (No. 2020YQQ039) and the Pre-research Project of National Natural Science Foundation of Anhui Polytechnic University (No. Xjky2022046).

Author information


Corresponding author

Correspondence to Chuanhui Shan.

Ethics declarations

Conflict of interest

Chuanhui Shan declares that he has no conflict of interest. Ao Li declares that he has no conflict of interest. Xiumei Chen declares that she has no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Data availability

The MNIST dataset that supports the findings of this study is openly available in GRAVITI at https://gas.graviti.cn/dataset/data-decorators/MNIST, reference number [28]. The Fashion-MNIST dataset is openly available in GRAVITI at https://www.graviti.cn/open-datasets/FashionMNIST, reference number [29]. The SVHN dataset is openly available in GRAVITI at https://www.graviti.cn/open-datasets/SVHN, reference number [30]. The CALTECH101 dataset is openly available in GRAVITI at https://gas.graviti.cn/dataset/graviti-open-dataset/Caltech101, reference number [31]. The 102 Category Flower (FLOWER102) dataset is openly available in GRAVITI at https://www.graviti.cn/open-datasets/Flower102, reference number [32].

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shan, C., Li, A. & Chen, X. Deep delay rectified neural networks. J Supercomput 79, 880–896 (2023). https://doi.org/10.1007/s11227-022-04704-z

