Multi-objective Optimization Based Recursive Feature Elimination for Process Monitoring


Abstract

Process monitoring helps to estimate the quality of end products, equipment health parameters, and the operational reliability of chemical processes. It is an area in which data-driven approaches are widely used by academic and industrial practitioners. With the ever-increasing complexity of process industries, there is a growing thrust towards developing process monitoring methods of a generic nature that can handle the inherent nonlinear characteristics of chemical processes. This has demanded the use of complex data-driven model paradigms within the process monitoring framework. To circumvent the issues associated with high-dimensional process data, a large body of these process monitoring algorithms extracts only the relevant features during training. Model complexity is a further issue that must be accounted for when employing these methods. In this work, an optimization-based feature selection method for process monitoring is proposed that trades off optimal feature selection against the resulting model complexity by solving a multi-objective optimization problem. In particular, this paper combines a neural network architecture with recursive feature elimination and a genetic algorithm to improve identification accuracy while reducing the number of variables that must be measured continuously in the process plant. The efficacy of the proposed approach was validated on a basic numerical case and tested on operational data from the benchmark Tennessee Eastman plant and a steel plates manufacturing case study.
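The abstract combines three ingredients: a neural network classifier, recursive feature elimination (RFE), and a multi-objective genetic algorithm that trades classification accuracy against the number of retained variables. As a rough illustration of the first two ingredients and of the resulting accuracy-versus-feature-count trade-off, the sketch below runs an RFE loop driven by a small MLP on synthetic data. The per-feature importance score (the norm of each feature's first-layer weights), the dataset, and all parameter choices are illustrative assumptions, not the authors' implementation, and the NSGA-II search used in the paper is not reproduced here.

```python
# Hedged sketch: neural-network-driven recursive feature elimination, tracking
# the two objectives the paper trades off (error vs. number of features).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for plant measurements (illustrative only).
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

remaining = list(range(X.shape[1]))
trace = []  # (number of features, misclassification error) along the RFE path
while len(remaining) >= 1:
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    clf.fit(X_tr[:, remaining], y_tr)
    err = 1.0 - clf.score(X_te[:, remaining], y_te)
    trace.append((len(remaining), err))
    if len(remaining) == 1:
        break
    # Assumed importance score: L2 norm of each feature's first-layer weights.
    importance = np.linalg.norm(clf.coefs_[0], axis=1)
    remaining.pop(int(np.argmin(importance)))  # eliminate the weakest feature

def pareto_front(points):
    # A point survives if no other point is at least as good in both
    # objectives and strictly better in one (both objectives are minimised).
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

print("Pareto-optimal (n_features, error) pairs along the elimination path:")
for n, err in sorted(pareto_front(trace)):
    print(f"  {n:2d} features -> error {err:.3f}")
```

In the paper, the selection of which feature subsets to evaluate is itself driven by a multi-objective genetic algorithm rather than by a single greedy elimination path; the Pareto filter above only hints at how the two objectives are compared.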


Acknowledgements

Hariprasad Kodamana would like to acknowledge the New Faculty Seed Grant from IIT Delhi. Hariprasad Kodamana and Manojkumar Ramteke would like to acknowledge the SERB Core Research Grant (file number CRG/2018/001555).

Author information

Corresponding author

Correspondence to Manojkumar Ramteke.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Singh, S., Agrawal, A., Kodamana, H. et al. Multi-objective Optimization Based Recursive Feature Elimination for Process Monitoring. Neural Process Lett 53, 1081–1099 (2021). https://doi.org/10.1007/s11063-021-10430-z

