An Optimal Weight Semi-Supervised Learning Machine for Neural Networks with Time Delay

Published in: Journal of Classification

Abstract

In this paper, an optimal weight semi-supervised learning machine for a single-hidden-layer feedforward network (SLFN) with time delay is developed. Both the input weights and the output weights of the SLFN are globally optimized with manifold regularization. Through the feature mapping, input vectors can be placed at prescribed positions in the feature space so that the separability of all nonlinearly separable patterns is maximized, unlabeled data can be leveraged to improve classification accuracy when labeled data are scarce, and a high degree of recognition accuracy can be achieved with a small number of hidden nodes in the SLFN. Simulation examples are presented to demonstrate the excellent performance of the proposed algorithm.
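To make the abstract's idea concrete, the sketch below shows how a manifold-regularized, semi-supervised SLFN of the extreme-learning-machine family can be trained in closed form: a graph Laplacian built over labeled and unlabeled inputs penalizes output variation along the data manifold, while a diagonal weight matrix restricts the fitting loss to the labeled rows. This is not the authors' algorithm (the paper additionally optimizes the input weights and models time delay); here the input weights are random, and all names (`semi_supervised_slfn`, `graph_laplacian`) and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def graph_laplacian(X, sigma=1.0):
    # L = D - W for a Gaussian-similarity graph over all samples
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def semi_supervised_slfn(X_l, y_l, X_u, n_hidden=20, lam=0.1, gamma=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    X = np.vstack([X_l, X_u])
    n_l, n = len(X_l), len(X)
    # Random input weights (the paper optimizes these too; fixed here for brevity)
    A = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ A + b)            # hidden-layer feature map over all samples
    L = graph_laplacian(X)            # manifold regularizer over labeled + unlabeled
    # One-hot targets; rows for unlabeled samples stay zero
    classes = np.unique(y_l)
    Y = np.zeros((n, len(classes)))
    Y[np.arange(n_l), np.searchsorted(classes, y_l)] = 1.0
    C = np.zeros(n)
    C[:n_l] = 1.0                     # fitting penalty only on labeled rows
    Cd = np.diag(C)
    # Closed-form output weights: (H'CH + lam*H'LH + gamma*I)^{-1} H'CY
    beta = np.linalg.solve(
        H.T @ Cd @ H + lam * (H.T @ L @ H) + gamma * np.eye(n_hidden),
        H.T @ Cd @ Y,
    )
    predict = lambda Xn: classes[np.argmax(np.tanh(Xn @ A + b) @ beta, axis=1)]
    return predict

# Toy example: two well-separated blobs, only 3 labels per class
rng = np.random.default_rng(1)
X0 = rng.normal([0.0, 0.0], 0.3, (30, 2))
X1 = rng.normal([3.0, 3.0], 0.3, (30, 2))
X_l = np.vstack([X0[:3], X1[:3]])
y_l = np.array([0] * 3 + [1] * 3)
X_u = np.vstack([X0[3:], X1[3:]])     # unlabeled points fill in the manifold
predict = semi_supervised_slfn(X_l, y_l, X_u)
acc = np.mean(predict(np.vstack([X0, X1])) == np.array([0] * 30 + [1] * 30))
```

Because the similarity graph links each blob internally while the two blobs are nearly disconnected, the Laplacian term propagates the few labels across each cluster, which is the mechanism by which unlabeled data improves accuracy when labels are scarce.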



Acknowledgments

The authors thank the anonymous referee for carefully reading the manuscript and for the fruitful comments and suggestions.

Funding

This research was partially supported by Zhejiang Provincial Natural Science Foundation of China under Grant No. LY18F030003 and Science & Technology Program of Lishui City under Grant No. 2017RC01.

Author information

Corresponding author

Correspondence to Chengbo Lu.



Cite this article

Lu, C., Mei, Y. An Optimal Weight Semi-Supervised Learning Machine for Neural Networks with Time Delay. J Classif 37, 656–670 (2020). https://doi.org/10.1007/s00357-019-09352-2
