
The extreme learning machine learning algorithm with tunable activation function

  • Extreme Learning Machine’s Theory & Application
  • Published in: Neural Computing and Applications

Abstract

In this paper, we propose an extreme learning machine (ELM) with tunable activation function (TAF-ELM) learning algorithm, which determines its activation functions dynamically by means of the differential evolution algorithm, based on the input data. The main objective is to overcome ELM’s dependence on a fixed slope of the activation function. We evaluate the algorithm on benchmark problems in function approximation and pattern classification. Compared with the ELM and E-ELM learning algorithms at the same network size, or with a more compact network configuration, the proposed algorithm achieves improved generalization performance with good accuracy. In addition, the proposed algorithm also performs very well among TAF neural network learning algorithms.
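
To make the idea concrete, here is a minimal sketch (in Python, assuming NumPy and SciPy are available) of an ELM whose per-node sigmoid slope parameters are tuned by differential evolution. This illustrates the scheme the abstract describes, not the authors’ exact TAF-ELM algorithm: the fitness function, parameter bounds, toy data, and all names below are assumptions made for illustration.

# Minimal sketch of the TAF-ELM idea from the abstract, NOT the authors'
# exact algorithm: an ELM whose sigmoid slope parameters are tuned by
# differential evolution. All names and the toy problem are illustrative.
import numpy as np
from scipy.optimize import differential_evolution

def elm_train_mse(slopes, X, y, W, b):
    # Hidden layer with a tunable-slope sigmoid activation.
    H = 1.0 / (1.0 + np.exp(-slopes * (X @ W + b)))
    # Standard ELM step: least-squares output weights via the pseudoinverse.
    beta = np.linalg.pinv(H) @ y
    return float(np.mean((H @ beta - y) ** 2))

rng = np.random.default_rng(0)

# Toy function-approximation problem: fit sin(x) on [-3, 3].
X = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y = np.sin(X).ravel()

n_hidden = 20
W = rng.standard_normal((1, n_hidden))  # random input weights, as in ELM
b = rng.standard_normal(n_hidden)       # random hidden biases

# Differential evolution searches one slope parameter per hidden node.
result = differential_evolution(
    elm_train_mse,
    bounds=[(0.1, 5.0)] * n_hidden,
    args=(X, y, W, b),
    maxiter=50,
    seed=0,
)
print("best training MSE:", result.fun)

Only the overall loop is taken from the abstract: a random ELM hidden layer, analytically solved output weights, and an evolutionary search over activation-function parameters. The parameters actually evolved in the paper and its evaluation criterion may differ from this sketch.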


References

  1. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1–3):489–501

  2. Huang GB, Wang DH, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2(2):107–122

  3. Huang GB, Zhu QY, Siew CK (2004) Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of the international joint conference on neural networks (IJCNN2004), Budapest, Hungary, pp 25–29

  4. Li MB, Er MJ (2006) Nonlinear system identification using extreme learning machine. In: 9th International conference on control, automation, robotics and vision, pp 1–4

  5. Li FC, Wang PK, Wang GE (2009) Comparison of the primitive classifiers with extreme learning machine in credit scoring. In: IEEE international conference on industrial engineering and engineering management, Hong Kong, pp 685–688

  6. Li B, Li YB (2011) Chaotic time series prediction based on ELM learning algorithm. J Tianjin Univ 44(8):701–704

  7. Han F, Huang DS (2006) Improved extreme learning machine for function approximation by encoding a priori information. Neurocomputing 69(16–18):2369–2373

  8. Harpham C, Dawson CW (2006) The effect of different basis functions on a radial basis function network for time series prediction: a comparative study. Neurocomputing 69(16–18):2161–2170

  9. Wu YS, Zhao MS (2001) A neuron model with trainable activation function (TAF) and its MFNN supervised learning. Sci China (Ser F) 44(5):366–375

  10. Shen YJ, Wang BW (2004) A fast learning algorithm of neural network with tunable activation function. Sci China (Ser F) 47(1):126–136

  11. Shen YJ, Wang BW, Chen FG, Cheng L (2004) A new multi-output neural model with tunable activation function and its applications. Neural Process Lett 20:85–104

  12. Zhu QY, Qin AK, Suganthan PN, Huang GB (2005) Evolutionary extreme learning machine. Pattern Recogn 38:1759–1763

  13. Hornik K (1991) Approximation capabilities of multilayer feedforward networks. Neural Netw 4(2):251–257

  14. Huang GB, Babri HA (1998) Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions. IEEE Trans Neural Netw 9(1):224–229

  15. Huang GB (2003) Learning capability and storage capacity of two hidden layer feedforward networks. IEEE Trans Neural Netw 14(2):274–281

  16. Li B, Li YB, Rong XW (2010) Intelligent optimization strategy for ELM-RBF neural networks. J Shandong Univ (Nat Sci) 45(5):48–52

  17. Huang GB, Siew CK (2005) Extreme learning machine with randomly assigned RBF kernels. Int J Inf Technol 11(1):16–24

  18. Murphy PM, Aha DW (2008) UCI repository of machine learning databases [online]. Available: http://archive.ics.uci.edu/ml/datasets.html

  19. Redondo MF, Espinosa CH (1999) Generalization capability of one and two hidden layers. In: International joint conference on neural networks, Washington DC, vol 3, pp 1840–1843

  20. Wei HK, Xu SX, Song WZ (2001) Generalization theory and generalization methods for neural networks. Acta Automat Sin 27(6):806–815

Acknowledgments

This work was supported by the Independent Innovation Foundation of Shandong University under Grants No. 2009JC010 and No. 2011JC011, the National Natural Science Foundation of China under Grant No. 61075091, the National Natural Science Foundation of China for Young Scholars under Grant No. 61105100, and the Natural Science Foundation of Shandong Province under Grants No. Y2008G21 and No. 2007BS01008.

Author information

Corresponding author

Correspondence to Yibin Li.

About this article

Cite this article

Li, B., Li, Y. & Rong, X. The extreme learning machine learning algorithm with tunable activation function. Neural Comput & Applic 22, 531–539 (2013). https://doi.org/10.1007/s00521-012-0858-9
