Single Layer Feedforward Networks Construction Based on Orthogonal Least Square and Particle Swarm Optimization

  • Conference paper

Artificial Intelligence and Soft Computing (ICAISC 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9692)

Abstract

Owing to their simplicity and universal approximation capability, single layer feedforward networks (SLFNs) are widely used in classification and regression problems. This paper presents a new constructive algorithm, OLS-PSO, based on the Orthogonal Least Square (OLS) method and the Particle Swarm Optimization (PSO) algorithm. Instead of evaluating the orthogonal components of each neuron as in the conventional OLS method, a new recursive formulation is derived. Based on this new evaluation of each neuron's contribution, the PSO algorithm is then used to search for the optimal parameters of the new neuron in continuous space. The proposed algorithm is evaluated on several practical regression problems and compared with other constructive algorithms. The results show that the proposed OLS-PSO algorithm can achieve a compact SLFN with good generalization ability.

This work was partially supported by the National Science Centre, Cracow, Poland under Grant No. 2013/11/B/ST6/01337.
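
To make the constructive scheme concrete, the sketch below grows an SLFN one sigmoid neuron at a time: each candidate neuron is scored by the classical OLS error-reduction of its output vector after orthogonalization against the already-selected hidden outputs, and a basic inertia-weight PSO searches the candidate's input weights and bias in continuous space. This is only a minimal illustration under those assumptions; the sigmoid activation, the PSO settings, and the helper names (`build_slfn`, `pso_search`, `contribution`) are not taken from the paper, and the paper's own recursive contribution formula is not reproduced here.

```python
# Minimal sketch (not the paper's exact formulation): constructive SLFN for
# regression with OLS-style candidate scoring and a basic inertia-weight PSO.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def contribution(params, X, residual, H):
    """Error reduction of a candidate sigmoid neuron (input weights + bias),
    measured on the part of its output not already explained by H."""
    w, b = params[:-1], params[-1]
    h = sigmoid(X @ w + b)                           # candidate hidden output, shape (N,)
    if H is not None:                                # orthogonalize against existing neurons
        h = h - H @ np.linalg.lstsq(H, h, rcond=None)[0]
    denom = h @ h
    return 0.0 if denom < 1e-12 else (h @ residual) ** 2 / denom

def pso_search(obj, dim, n_particles=20, iters=50, bound=3.0, rng=None):
    """Maximize obj over [-bound, bound]^dim with global-best PSO."""
    if rng is None:
        rng = np.random.default_rng(0)
    pos = rng.uniform(-bound, bound, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                    # linearly decreasing inertia weight
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -bound, bound)
        vals = np.array([obj(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest

def build_slfn(X, y, max_neurons=10, tol=1e-3):
    """Grow an SLFN one hidden neuron at a time; return hidden-layer parameters
    (one row per neuron: input weights then bias) and the output weights."""
    N, d = X.shape
    neurons, H, residual = [], None, y.astype(float).copy()
    for _ in range(max_neurons):
        score = lambda p: contribution(p, X, residual, H)
        best = pso_search(score, d + 1)              # PSO over the new neuron's parameters
        h = sigmoid(X @ best[:-1] + best[-1])
        H = h[:, None] if H is None else np.column_stack([H, h])
        beta = np.linalg.lstsq(H, y, rcond=None)[0]  # output weights by linear least squares
        residual = y - H @ beta
        neurons.append(best)
        if residual @ residual / N < tol:            # stop once the MSE target is met
            break
    return np.array(neurons), beta
```

Called as `hidden, beta = build_slfn(X, y)` with `X` of shape `(N, d)` and `y` of shape `(N,)`, the network's prediction is `sigmoid(X @ hidden[:, :-1].T + hidden[:, -1]) @ beta`; the `tol` and `max_neurons` arguments here are placeholders for whatever stopping rule the paper actually uses.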

Author information

Corresponding author

Correspondence to Pawel Rozycki.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Wu, X., Rozycki, P., Wilamowski, B.M. (2016). Single Layer Feedforward Networks Construction Based on Orthogonal Least Square and Particle Swarm Optimization. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L., Zurada, J. (eds.) Artificial Intelligence and Soft Computing. ICAISC 2016. Lecture Notes in Computer Science, vol. 9692. Springer, Cham. https://doi.org/10.1007/978-3-319-39378-0_15

  • DOI: https://doi.org/10.1007/978-3-319-39378-0_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-39377-3

  • Online ISBN: 978-3-319-39378-0

  • eBook Packages: Computer Science (R0)
