
Performance of soft sensors based on stochastic configuration networks with nonnegative garrote

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this study, stochastic configuration networks (SCNs) and the nonnegative garrote (NNG) algorithm are combined to develop a soft-sensing technique that infers difficult-to-measure variables from easy-to-measure variables in industrial processes. The proposed method consists of two stages: first, an SCN is trained on the industrial data; second, the NNG algorithm shrinks the input weights of the well-trained learner model and removes redundant input variables. Cross-validation and the Akaike information criterion are employed to determine the optimal shrinkage parameter of the NNG. A numerical example and real industrial data are used to validate the performance of the proposed algorithm, and several state-of-the-art feature selection schemes for neural networks are tested for comparison. The comparative results demonstrate that the proposed soft sensor outperforms the others in terms of prediction accuracy.
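To make the two-stage procedure more concrete, the following is a minimal sketch of the first stage in Python. It is not the authors' implementation: the candidate-acceptance rule below is a simplified greedy residual-reduction criterion standing in for the SCN supervisory inequality, and the activation, scope, tolerance, and candidate-pool sizes are illustrative assumptions.

```python
# Illustrative sketch of stage 1 (SCN modeling); not the authors' code.
# Hidden nodes are added one at a time: candidates are drawn at random and the
# one giving the largest residual reduction is kept, then all output weights
# are refit by least squares.  The acceptance rule and parameter values are
# simplified stand-ins for the SCN supervisory mechanism.
import numpy as np

def train_scn(X, y, max_nodes=50, n_candidates=100, scope=1.0, tol=1e-3, seed=None):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W, b = [], []                       # hidden-node input weights and biases
    H = np.empty((n, 0))                # hidden-layer output matrix
    beta = np.zeros(0)                  # output weights
    e = y.copy()                        # current residual
    for _ in range(max_nodes):
        best, best_gain = None, 0.0
        for _ in range(n_candidates):
            w_c = rng.uniform(-scope, scope, d)
            b_c = rng.uniform(-scope, scope)
            h_c = np.tanh(X @ w_c + b_c)
            gain = (e @ h_c) ** 2 / (h_c @ h_c)   # residual reduction of this candidate
            if gain > best_gain:
                best, best_gain = (w_c, b_c, h_c), gain
        if best is None:                # no useful candidate found
            break
        W.append(best[0]); b.append(best[1])
        H = np.column_stack([H, best[2]])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # refit all output weights
        e = y - H @ beta
        if e @ e / n < tol:             # stop once the training error is small enough
            break
    return np.array(W), np.array(b), beta

def scn_predict(X, W, b, beta):
    return np.tanh(X @ W.T + b) @ beta
```

The second stage, NNG shrinkage of the input weights with the shrinkage parameter chosen by cross-validation, is sketched in the Appendix.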



Acknowledgements

The authors would like to thank the editors and reviewers for the time and effort spent reviewing this paper and for their detailed and constructive comments, which improved its presentation and quality.

Funding

This work is funded in part by the Shandong Provincial Natural Science Foundation under Grant ZR2021MF022, in part by the Key Research and Development Program of Shandong Province under Grant 2019GGX104037, and in part by the National Key Research and Development Program of China under Grant 2018AAA0100304.

Author information

Corresponding authors

Correspondence to Kai Sun or Dianhui Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Table 5 Parameter configuration of the artificial dataset

Table 6 Parameter configuration of the CDQ dataset

Tables 5 and 6 show the settings of the important parameters of the algorithms in the experiments on the artificial dataset and the CDQ dataset, respectively. Para.1 is the number of bins used when estimating the probability density function with a histogram; during this calculation, the samples are sorted into Para.1 equally spaced bins. Para.2 and Para.3 are the weights of the regularization terms when LASSO is applied to the input and hidden layers, respectively. Para.4 is the garrote parameter s of the NNG.
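As a non-authoritative illustration of how the garrote parameter s (Para.4) enters the second stage, the sketch below attaches one nonnegative shrinkage coefficient to each input variable of the simplified SCN sketched after the abstract and fits the coefficients under the classical garrote budget constraint, with s chosen by cross-validation as described in the abstract. The helper names (fit_nng, select_s_by_cv) and the use of a sum-type budget are assumptions made for illustration; the AIC-based selection mentioned in the abstract is not shown.

```python
# Hedged sketch of stage 2: nonnegative garrote (NNG) shrinkage of the trained
# SCN's input weights.  One coefficient c_k >= 0 is attached to input variable k,
# the inputs (equivalently, the input weights) are scaled by c, and the
# coefficients are fitted under the budget sum(c) <= s (Para.4).  Inputs whose
# c_k shrinks to (near) zero are removed from the soft sensor.
# train_scn / scn_predict are the illustrative helpers sketched after the abstract.
import numpy as np
from scipy.optimize import minimize

def fit_nng(X, y, W, b, beta, s):
    d = X.shape[1]
    def loss(c):
        # Scaling the columns of X by c is equivalent to scaling the SCN input weights.
        return np.mean((y - scn_predict(X * c, W, b, beta)) ** 2)
    x0 = np.full(d, min(1.0, s / d))    # feasible starting point
    res = minimize(loss, x0=x0, method="SLSQP",
                   bounds=[(0.0, None)] * d,
                   constraints=[{"type": "ineq", "fun": lambda c: s - c.sum()}])
    return res.x

def select_s_by_cv(X, y, W, b, beta, s_grid, k=5, seed=0):
    """Pick the garrote parameter s from s_grid by k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    cv_err = []
    for s in s_grid:
        fold_err = []
        for f in folds:
            train = np.setdiff1d(idx, f)
            c = fit_nng(X[train], y[train], W, b, beta, s)
            pred = scn_predict(X[f] * c, W, b, beta)
            fold_err.append(np.mean((y[f] - pred) ** 2))
        cv_err.append(np.mean(fold_err))
    return s_grid[int(np.argmin(cv_err))]
```

Smaller values of s force more coefficients to zero and yield a sparser soft sensor, while larger values approach the unshrunk SCN.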

About this article


Cite this article

Tian, P., Sun, K. & Wang, D. Performance of soft sensors based on stochastic configuration networks with nonnegative garrote. Neural Comput & Applic 34, 16061–16071 (2022). https://doi.org/10.1007/s00521-022-07254-w


  • DOI: https://doi.org/10.1007/s00521-022-07254-w
