
Cycle sampling neural network algorithms and applications

The Journal of Supercomputing

Abstract

This study proposes and optimizes two improved sampling neural network (SNN) algorithms, Cycle SNN (CSNN) and Rolling-Cycle SNN (RSNN), to improve the accuracy of the basic SNN (BSNN). Experiments show that the improved algorithms achieve significant gains in both accuracy and training efficiency. The study also refines the theoretical framework of SNNs and unifies the vector form of the SNN algorithms. Theoretical analysis shows that the improvement comes from effectively reducing high-frequency components and aliasing distortion through cycle extension and rolling. These efforts contribute to exploring the potential and prospects of SNN applications, and the new SNN network structure, together with the SNN error-diffusion (SNN-ED) convergence method, offers a new direction for the development of neural networks.
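The cycle-extension idea can be illustrated with the Shannon sampling expansion itself: reconstructing a signal from a single period of samples suffers truncation error near the interval edges, and periodically extending (tiling) the samples before reconstruction reduces that error. The sketch below is a minimal NumPy illustration of this principle; the sampling rate, test signal, and number of extension periods are assumed for illustration, and this is not the authors' CSNN implementation:

```python
import numpy as np

def sinc_reconstruct(samples, T, t_query, t0=0.0):
    """Truncated Shannon sampling expansion:
    x(t) ~= sum_n samples[n] * sinc((t - t0 - n*T) / T)."""
    n = np.arange(len(samples))
    basis = np.sinc((t_query[:, None] - (t0 + n[None, :] * T)) / T)
    return basis @ samples

fs = 20.0                                  # sampling rate (Hz), illustrative
T = 1.0 / fs
x = lambda t: np.sin(2 * np.pi * 3 * t)    # bandlimited, 1-periodic test signal
t_n = np.arange(0.0, 1.0, T)               # one period of samples
x_n = x(t_n)

t_q = np.linspace(0.05, 0.95, 50)          # reconstruction points

# Truncated expansion over a single period: error grows near the edges
err_plain = np.max(np.abs(sinc_reconstruct(x_n, T, t_q) - x(t_q)))

# "Cycle extension": tile the samples over extra periods on both sides,
# so fewer terms of the infinite expansion are truncated away
k = 5                                      # extra periods per side (assumed)
x_ext = np.tile(x_n, 2 * k + 1)
err_cyc = np.max(np.abs(sinc_reconstruct(x_ext, T, t_q, t0=-float(k)) - x(t_q)))

print(err_plain, err_cyc)                  # extension shrinks the error
```

Here `np.sinc` is the normalized sinc, sin(πx)/(πx), which is the sampling-theorem interpolation kernel once the argument is scaled by the sampling period T.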



Data Availability

The data of NTC thermo-sensitive semiconductor resistors were derived from the following resources available in the public domain: https://www.sinochip.net/ [45].

References

  1. Sani S, Shermeh HE (2022) A novel algorithm for detection of COVID-19 by analysis of chest CT images using Hopfield neural network. Expert Syst Appl 197:116740. https://doi.org/10.1016/j.eswa.2022.116740

  2. Bilal DK, Unel M, Tunc LT et al (2022) Development of a vision-based pose estimation system for robotic machining and improving its accuracy using LSTM neural networks and sparse regression. Robot Comput Integr Manuf 74:102262. https://doi.org/10.1016/j.rcim.2021.102262

  3. Emami SA, Castaldi P, Banazadeh A (2022) Neural network-based flight control systems: present and future. Annu Rev Control 53:97–137. https://doi.org/10.1016/j.arcontrol.2022.04.006

  4. Larestani A, Mousavi SP, Hadavimoghaddam F et al (2022) Predicting the surfactant-polymer flooding performance in chemical enhanced oil recovery: cascade neural network and gradient boosting decision tree. Alex Eng J 61:7715–7731. https://doi.org/10.1016/j.aej.2022.01.023

  5. Zhang L, Zhu L, Hua C et al (2021) Adaptive neural network control for a class of interconnected pure-feedback time-delay nonlinear systems with full-state constraints and unknown measurement sensitivities. Neurocomputing 461:147–161. https://doi.org/10.1016/j.neucom.2021.07.043

  6. Yang F, Zhang H, Tao S (2021) Travel order quantity prediction via attention-based bidirectional LSTM networks. J Supercomput 78:4398–4420. https://doi.org/10.1007/s11227-021-04032-8

  7. Cai G, Wu L, Li M (2021) The circuit fault diagnosis method based on spectrum analyses and ELM. In: 2021 IEEE 16th Conference on Industrial Electronics and Applications (ICIEA), pp 475–479

  8. Rumelhart DE, Hinton GE, Williams RJ (1986) Learning representations by back-propagating errors. Nature 323:533–536

  9. Zhang R, Huang G-B, Sundararajan N et al (2007) Improved GAP-RBF network for classification problems. Neurocomputing 70:3011–3018. https://doi.org/10.1016/j.neucom.2006.07.016

  10. Zhou Y, Li C, Wang H (2018) Stability analysis on state-dependent impulsive Hopfield neural networks via fixed-time impulsive comparison system method. Neurocomputing 316:20–29. https://doi.org/10.1016/j.neucom.2018.07.047

  11. Huang G-B, Wang D (2011) Advances in extreme learning machines (ELM2010). Neurocomputing 74:2411–2412. https://doi.org/10.1016/j.neucom.2011.03.030

  12. Chi Z, Yuan G, Ming L (2021) A review of development and application of artificial neural network models. Comput Eng Appl 56:1–15. https://doi.org/10.3778/j.issn.1002-8331.2102-0256

  13. Li X, Wang L, Sung E (2008) AdaBoost with SVM-based component classifiers. Eng Appl Artif Intell 21:785–795. https://doi.org/10.1016/j.engappai.2007.07.001

  14. Huang GB, Zhou H, Ding X et al (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern B Cybern 42:513–529. https://doi.org/10.1109/TSMCB.2011.2168604

  15. Rathod N, Wankhade S (2022) Optimizing neural network based on cuckoo search and invasive weed optimization using extreme learning machine approach. Neurosci Inform. https://doi.org/10.1016/j.neuri.2022.100075

  16. Rong H-J, Ong Y-S, Tan A-H et al (2008) A fast pruned-extreme learning machine for classification problem. Neurocomputing 72:359–366. https://doi.org/10.1016/j.neucom.2008.01.005

  17. Chen Z, Yang C, Qiao J (2021) The optimal design and application of LSTM neural network based on the hybrid coding PSO algorithm. J Supercomput 78:7227–7259. https://doi.org/10.1007/s11227-021-04142-3

  18. Lindemann B, Maschler B, Sahlab N et al (2021) A survey on anomaly detection for technical systems using LSTM networks. Comput Ind 131:103498. https://doi.org/10.1016/j.compind.2021.103498

  19. Etxegarai G, López A, Aginako N et al (2022) An analysis of different deep learning neural networks for intra-hour solar irradiation forecasting to compute solar photovoltaic generators' energy production. Energy Sustain Dev 68:1–17. https://doi.org/10.1016/j.esd.2022.02.002

  20. Wambugu N, Chen Y, Xiao Z et al (2021) Hyperspectral image classification on insufficient-sample and feature learning using deep neural networks: a review. Int J Appl Earth Obs Geoinf 105:102603. https://doi.org/10.1016/j.jag.2021.102603

  21. Hadsell R, Rao D, Rusu AA et al (2020) Embracing change: continual learning in deep neural networks. Trends Cogn Sci 24:1028–1040. https://doi.org/10.1016/j.tics.2020.09.004

  22. Bouwmans T, Javed S, Sultana M et al (2019) Deep neural network concepts for background subtraction: a systematic review and comparative evaluation. Neural Netw 117:8–66. https://doi.org/10.1016/j.neunet.2019.04.024

  23. Habib G, Qureshi S (2020) Optimization and acceleration of convolutional neural networks: a survey. J King Saud Univ Comput Inf Sci. https://doi.org/10.1016/j.jksuci.2020.10.004

  24. Wang H, Wang Y, Wang X et al (2022) A novel deep-learning model for RDTS signal denoising based on down-sampling and convolutional neural network. J Lightwave Technol 40:3647–3653. https://doi.org/10.1109/jlt.2022.3149400

  25. Liu X, Qi D-Y, Li W-L et al (2021) Exploring the Internet of Things sequence-structure detection and supertask network generation of temporal-spatial-based graph convolutional neural network. J Supercomput 78:5029–5049. https://doi.org/10.1007/s11227-021-04041-7

  26. Cai G, Wu L (2021) Sampling neural network: a novel neural network based on sampling theorem. In: 2021 6th International Symposium on Computer and Information Processing Technology (ISCIPT), pp 717–720

  27. Shannon CE (1949) Communication in the presence of noise. Proc IRE 37:10–21

  28. Huo H, Sun W (2015) Average sampling theorem. Sci Sin Math 45:1403–1422. https://doi.org/10.1360/n012015-00026

  29. Zhang J, Wang G, Wang H (2017) Improvement of wind-induced vibration analysis in time domain based on Shannon sampling theorem. Noise Vib Control 3:131–136. https://doi.org/10.3969/j.issn.1006-1355.2017.02.027

  30. Zheng JL et al (2009) Introduction to signals and systems. Higher Education Press, Beijing

  31. Oppenheim AV, Willsky AS, Nawab SH (2015) Signals and systems, 2nd edn. Publishing House of Electronics Industry, Beijing

  32. Cheng PQ (2013) Digital signal processing tutorial. Tsinghua University Press, Beijing

  33. Luo X, Zhang Z (2021) Data recovery with sub-Nyquist sampling: fundamental limit and a detection algorithm. Front Inform Technol Electron Eng 22:232–243. https://doi.org/10.1631/fitee.1900320

  34. Butzer PL, Engels W, Scheben U (1982) Magnitude of the truncation error in sampling expansions of bandlimited signals. IEEE Trans Acoust Speech Signal Process 30:906–912

  35. Jagerman D (1966) Bounds for truncation error of the sampling expansion. SIAM J Appl Math 14:714–723

  36. Beutler FJ (1976) On the truncation error of the cardinal sampling expansion. IEEE Trans Inform Theory 22:568–573

  37. Cambanis S, Masry E (1982) Truncation error bounds for the cardinal sampling expansion of bandlimited signals. IEEE Trans Inform Theory 28:605–612

  38. Hu Y (2008) Truncation error bounds for the cardinal sampling expansion of bandlimited signals. J Graduate School Chin Acad Sci 25:460–466

  39. Erling G, Zhiqiang S, Yadong L (2015) From the Fourier transform of non-periodic signals to the Fourier series of periodic signals. J Electric Electron Educ 37:42–44

  40. Zhou K, Kang Y (2005) Neural network model and its MATLAB simulation program design. Tsinghua University Press, Beijing

  41. Zhang D (2010) MATLAB numerical calculation method. China Machine Press, Beijing

  42. Suykens JAK, Van Gestel T, De Brabanter J et al. Least squares support vector machines. https://www.esat.kuleuven.be/stadius/lssvmlab/toolbox.html

  43. Huang GB. Extreme learning machine. https://www.ntu.edu.sg/eee/icis/cv/egbhuang.html

  44. Huang G, Zhu Q, Siew C (2004) Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of the 2004 IEEE International Joint Conference on Neural Networks, Budapest, pp 985–990

  45. NTC thermo-sensitive semiconductor resistors, Sinochip Electronics Co., Ltd. http://www.sinochip.net/list/?107_1.html

Acknowledgements

This work was supported by the Science and Technology Research Foundation of the Education Department in Jiangxi Province (No. GJJ218503, GJJ219105).

Author information


Corresponding author

Correspondence to Lingyan Wu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest with respect to this work, including any commercial or associative interest connected with the submitted manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Cite this article

Cai, G., Wu, L. Cycle sampling neural network algorithms and applications. J Supercomput 79, 9889–9914 (2023). https://doi.org/10.1007/s11227-022-05019-9
