
Sensitive time series prediction using extreme learning machine

  • Original Article
  • Published:
International Journal of Machine Learning and Cybernetics

Abstract

Inspired by multi-granularity and fractal theory, this work focuses on how to construct training and test datasets at different levels from a small dataset in a complex real-time application. Such applications do not purely pursue the most accurate values; rather, a low-cost (sub-optimal) solution is often preferable for timely prediction on such sensitive time series. A chaotic system is then experimented on and analysed in detail under three gap-sampling schemes, namely micro-scale, middle-scale and macro-scale sampling. The influence of different activation functions on the accuracy and speed of the network model is also discussed. The efficiency of sensitive time series prediction using the Extreme Learning Machine (ST-ELM) is examined on six widely used datasets (Abalone, Auto-MPG, Body fat, California Housing, Cloud and Strike). The simulations show that the suggested ST-ELM improves on existing performance when applied to idle-spectrum prediction in cognitive wireless networks.
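To make the setup concrete, the following is a minimal Python/NumPy sketch of the two ingredients the abstract describes: gap-sampling a chaotic series at three scales and fitting a standard ELM (random hidden layer, analytic output weights). The logistic map, the specific gap sizes, the hidden-layer size and the tanh activation are illustrative assumptions, not the paper's exact configuration, which uses its own chaotic system, datasets and ST-ELM variant.

```python
# Illustrative sketch only (not the authors' implementation): sample a chaotic
# logistic-map series at three assumed gap sizes (micro/middle/macro scale)
# and fit a basic extreme learning machine to predict the next value.
import numpy as np

rng = np.random.default_rng(0)

def logistic_map(n, r=3.9, x0=0.4):
    """Generate a chaotic time series from the logistic map."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

def gap_sample(series, gap, order=4):
    """Build (input, target) pairs from every `gap`-th point (assumed scheme)."""
    s = series[::gap]
    X = np.array([s[i:i + order] for i in range(len(s) - order)])
    y = s[order:]
    return X, y

def elm_fit(X, y, hidden=50, activation=np.tanh):
    """Standard ELM: random input weights, output weights via pseudo-inverse."""
    W = rng.standard_normal((X.shape[1], hidden))
    b = rng.standard_normal(hidden)
    H = activation(X @ W + b)
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta, activation=np.tanh):
    return activation(X @ W + b) @ beta

series = logistic_map(3000)
for name, gap in [("micro", 1), ("middle", 5), ("macro", 20)]:  # gaps are illustrative
    X, y = gap_sample(series, gap)
    split = int(0.8 * len(X))
    W, b, beta = elm_fit(X[:split], y[:split])
    pred = elm_predict(X[split:], W, b, beta)
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    print(f"{name:6s} scale (gap={gap:2d}): test RMSE = {rmse:.4f}")
```

Swapping `np.tanh` for another activation (e.g. a sigmoid or ReLU) in `elm_fit`/`elm_predict` is the kind of comparison of accuracy versus training speed that the abstract refers to.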




Acknowledgements

This project was supported in part by the National Natural Science Foundation of China (No. 61572074), the Ladder Plan Project of Beijing Key Lab (No. Z121101002812005), and the China Scholarship Council visiting scholarship to the UK (No. 201706465028).

Author information

Corresponding author

Correspondence to Hong-Bo Wang.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, HB., Liu, X., Song, P. et al. Sensitive time series prediction using extreme learning machine. Int. J. Mach. Learn. & Cyber. 10, 3371–3386 (2019). https://doi.org/10.1007/s13042-019-00924-7

