
An Improved Double Hidden-Layer Variable Length Incremental Extreme Learning Machine Based on Particle Swarm Optimization

  • Conference paper
  • First Online:
Intelligent Computing Theories and Application (ICIC 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10955)


Abstract

Extreme learning machine (ELM) has been widely used in diverse domains. With the development of deep learning, integrating ELM with deep learning methods has become a promising approach to feature extraction and classification. However, because its hidden-layer parameters are generated randomly, ELM may require a large number of hidden nodes and suffer from ill-conditioning. In this paper, an effective hybrid approach based on the Variable-length Incremental ELM and the Particle Swarm Optimization (PSO) algorithm, called PSO-VIELM, is proposed to regulate the weights and extract features. The new method builds two hidden layers to obtain a compact structure with better generalization performance. In the first hidden layer, named the extraction layer, feature learning is performed on the raw data, hidden nodes are added dynamically, and PSO updates the weights corresponding to the hidden nodes using the fitting error as the fitness function. In the second hidden layer, named the classification layer, the features produced by the extraction layer are classified, and the network weights are updated using the cross-entropy as the fitness function. To determine an appropriate number of hidden nodes, node growth stops once the fitness function on the validation set begins to rebound. Results on several datasets show that PSO-VIELM achieves better generalization performance than other constructive ELMs.
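
As a rough illustration of the growth rule described in the abstract, the following sketch shows how one layer could be grown node by node, with PSO tuning each new node's input weights against the training fitting error and growth stopping when the validation error rebounds. This is a minimal, single-output regression version under assumed choices (function names, PSO hyper-parameters, sigmoid activation), not the authors' implementation.

# Minimal sketch of variable-length incremental growth with PSO-tuned input
# weights (single-output regression case). Names and hyper-parameters are
# illustrative assumptions, not the paper's implementation.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pso_minimize(fitness, dim, n_particles=20, n_iters=50, lo=-1.0, hi=1.0):
    # Plain global-best PSO minimizing `fitness` over a dim-dimensional vector.
    pos = np.random.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iters):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([fitness(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

def grow_extraction_layer(X_tr, y_tr, X_val, y_val, max_nodes=50):
    # Add hidden nodes one at a time. Each candidate node's input weights and
    # bias are tuned by PSO to minimize the training fitting error; growth
    # stops as soon as the error on the validation set rebounds.
    d = X_tr.shape[1]
    W, b = [], []                       # accepted input weights / biases
    best_val_err = np.inf
    for _ in range(max_nodes):
        def fit_err(p):
            H = sigmoid(X_tr @ np.column_stack(W + [p[:d]]) + np.array(b + [p[d]]))
            beta, *_ = np.linalg.lstsq(H, y_tr, rcond=None)   # ELM output weights
            return np.mean((H @ beta - y_tr) ** 2)
        p = pso_minimize(fit_err, d + 1)
        W_new, b_new = W + [p[:d]], b + [p[d]]
        H_tr = sigmoid(X_tr @ np.column_stack(W_new) + np.array(b_new))
        beta, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)
        H_val = sigmoid(X_val @ np.column_stack(W_new) + np.array(b_new))
        val_err = np.mean((H_val @ beta - y_val) ** 2)
        if val_err > best_val_err:      # rebound on the validation set: stop growing
            break
        best_val_err, W, b = val_err, W_new, b_new
    return np.column_stack(W), np.array(b)

A second layer grown the same way, but with a cross-entropy fitness computed on class labels, would play the role of the classification layer described above.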


Acknowledgements

This work was supported by the National Natural Science Foundation of China [Nos. 61572241 and 61271385], the National Key R&D Program of China [No. 2017YFC0806600], the Foundation of the Peak of Six Talents of Jiangsu Province [No. 2015-DZXX-024], the Fifth 333 High Level Talented Person Cultivating Project of Jiangsu Province [No. (2016) III-0845], and the Research Innovation Program for College Graduates of Jiangsu Province [1291170030].

Author information

Corresponding author

Correspondence to Fei Han.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Li, Q., Han, F., Ling, Q. (2018). An Improved Double Hidden-Layer Variable Length Incremental Extreme Learning Machine Based on Particle Swarm Optimization. In: Huang, D.-S., Jo, K.-H., Zhang, X.-L. (eds) Intelligent Computing Theories and Application. ICIC 2018. Lecture Notes in Computer Science, vol 10955. Springer, Cham. https://doi.org/10.1007/978-3-319-95933-7_5


  • DOI: https://doi.org/10.1007/978-3-319-95933-7_5

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-95932-0

  • Online ISBN: 978-3-319-95933-7

  • eBook Packages: Computer Science, Computer Science (R0)
