
Stochastic configuration broad learning system and its approximation capability analysis

Original Article
International Journal of Machine Learning and Cybernetics

Abstract

In this paper, a stochastic configuration broad learning system (SCBLS) is proposed for data modeling. The proposed SCBLS is built as a flat network whose architecture is determined by a constructive learning approach. The input parameters of the feature nodes and enhancement nodes of SCBLS are randomly assigned under a supervisory mechanism: inequality constraints are used both to assign the hidden parameters and to adaptively select the scopes of the random parameters. The output parameters of SCBLS are determined either in a constructive manner or by solving a global least squares problem. It is proved that the proposed SCBLS possesses the universal approximation property. The performance of the proposed SCBLS is evaluated on function approximation, benchmark datasets, and time series prediction. Numerical examples show that SCBLS achieves satisfactory approximation accuracy.
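
To make the construction concrete, the following Python sketch illustrates the general flavour of the procedure described in the abstract: random feature nodes, enhancement nodes whose random parameters are drawn from adaptively widened scopes and accepted only when a stochastic-configuration-style inequality on the current residual holds, and output weights refit by a global least-squares solve. It is a minimal illustration under assumed choices (sigmoid activation, candidate counts, scope values, constraint form); it is not the paper's algorithm or settings.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scbls_sketch(X, Y, n_feature_nodes=20, max_enh_nodes=50,
                 scopes=(1.0, 5.0, 10.0), r=0.99, tol=1e-3, seed=0):
    """Minimal sketch of a stochastic-configuration-style broad model.

    Feature nodes are random maps of the input; enhancement-node
    parameters are drawn from adaptively widened scopes [-lam, lam]
    and accepted only if they satisfy an SCN-type inequality constraint.
    Output weights are refit by a global least-squares solve.
    All names and values here are illustrative, not the paper's settings.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Random feature nodes (mapped features).
    Wf = rng.uniform(-1, 1, (d, n_feature_nodes))
    bf = rng.uniform(-1, 1, n_feature_nodes)
    Z = sigmoid(X @ Wf + bf)

    H = Z.copy()                               # current hidden output matrix
    beta, _, _, _ = np.linalg.lstsq(H, Y, rcond=None)
    e = Y - H @ beta                           # current residual

    for _ in range(max_enh_nodes):
        if np.linalg.norm(e) < tol:
            break
        accepted = None
        for lam in scopes:                     # adaptively enlarge the scope
            for _ in range(30):                # candidate random nodes per scope
                w = rng.uniform(-lam, lam, Z.shape[1])
                b = rng.uniform(-lam, lam)
                h = sigmoid(Z @ w + b)         # enhancement node on mapped features
                # SCN-style supervisory inequality on the residual:
                # <e, h>^2 / ||h||^2 must exceed (1 - r) * ||e||^2.
                lhs = (e.T @ h) ** 2 / (h @ h + 1e-12)
                if np.all(lhs > (1 - r) * np.sum(e ** 2, axis=0)):
                    accepted = h
                    break
            if accepted is not None:
                break
        if accepted is None:
            break                              # no admissible node found
        H = np.column_stack([H, accepted])
        beta, _, _, _ = np.linalg.lstsq(H, Y, rcond=None)  # global refit
        e = Y - H @ beta
    return Wf, bf, H, beta
```

In the paper the output parameters can also be updated in a constructive manner as each node is added; the global least-squares refit shown above corresponds to the second of the two options mentioned in the abstract.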


Availability of data and material

All data used during this study are available from the corresponding author on reasonable request.

Code availability

The code used during the current study is available from the corresponding author on reasonable request.

Notes

  1. KEEL: http://www.keel.es/

  2. TSDL: https://datamarket.com/data/list/?q=provider:tsdl


Acknowledgements

This work is supported by the National Natural Science Foundation (NNSF) of China under Grants 61773088 and 12071056, the Fundamental Research Funds for the Central Universities (DUT20JC30), and the National Key R&D Program of China (2018AAA0100300).

Author information


Corresponding author

Correspondence to Degang Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, W., Wang, D., Li, H. et al. Stochastic configuration broad learning system and its approximation capability analysis. Int. J. Mach. Learn. & Cyber. 13, 797–810 (2022). https://doi.org/10.1007/s13042-021-01341-5


