
Parameter selection method for support vector machine based on adaptive fusion of multiple kernel functions and its application in fault diagnosis

  • Brain-Inspired Computing and Machine Learning for Brain Health
  • Published:
Neural Computing and Applications

Abstract

A new model parameter selection method for support vector machines, based on adaptive fusion of multiple kernel functions, is proposed in this paper. The characteristics of local kernels, global kernels, mixed kernels and multiple kernels are first analyzed. The fusion coefficients of the multiple kernel function, the kernel function parameters and the regression parameters are then combined into a single state vector, so that the model selection problem is transformed into a nonlinear system state estimation problem. A fifth-degree cubature Kalman filter is used to estimate these parameters, thereby realizing adaptive selection of the multiple-kernel fusion coefficients, the kernel parameters and the regression parameters. A simulation experiment on the PE process is carried out to demonstrate the application of the method to fault diagnosis.
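To make the idea concrete, the fragment below is a minimal sketch, not the authors' implementation: it forms a convex combination of a local Gaussian RBF kernel and a global polynomial kernel, stacks the fusion weight, the kernel parameters and the SVR regression parameters (C and the epsilon-tube width) into one candidate parameter vector, and scores that vector by validation error. The kernel choices, helper names and the use of scikit-learn's SVR with a precomputed kernel are assumptions for illustration; the paper additionally treats this parameter vector as the state of a nonlinear system and estimates it with a fifth-degree cubature Kalman filter, which is not reproduced here.

    # Minimal sketch (assumed helper names; not the authors' code) of fusing a
    # local RBF kernel with a global polynomial kernel and evaluating one
    # candidate parameter vector by validation error.
    import numpy as np
    from sklearn.svm import SVR

    def rbf_kernel(X, Z, gamma):
        """Local kernel: Gaussian RBF, exp(-gamma * ||x - z||^2)."""
        d2 = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Z**2, axis=1)[None, :] - 2.0 * X @ Z.T)
        return np.exp(-gamma * np.maximum(d2, 0.0))

    def poly_kernel(X, Z, degree, coef0=1.0):
        """Global kernel: polynomial, (x . z + coef0)^degree."""
        return (X @ Z.T + coef0) ** int(round(degree))

    def fused_kernel(X, Z, w, gamma, degree):
        """Adaptive fusion: K = w * K_rbf + (1 - w) * K_poly, with 0 <= w <= 1."""
        w = float(np.clip(w, 0.0, 1.0))
        return w * rbf_kernel(X, Z, gamma) + (1.0 - w) * poly_kernel(X, Z, degree)

    def validation_error(theta, X_tr, y_tr, X_val, y_val):
        """Measurement-style score: validation MSE of an SVR trained with the
        fused kernel defined by the candidate parameter vector theta."""
        w, gamma, degree, C, eps = theta
        K_tr = fused_kernel(X_tr, X_tr, w, gamma, degree)
        K_val = fused_kernel(X_val, X_tr, w, gamma, degree)
        model = SVR(kernel="precomputed", C=C, epsilon=eps).fit(K_tr, y_tr)
        return float(np.mean((model.predict(K_val) - y_val) ** 2))

    # Candidate state vector: [fusion weight w, RBF gamma, polynomial degree,
    # SVR penalty C, epsilon]. In the paper this vector is estimated recursively
    # (fifth-degree cubature Kalman filter); here it is only evaluated once.
    theta = np.array([0.5, 0.1, 2.0, 10.0, 0.1])

A filter-based selection scheme would treat this parameter vector as the hidden state of a nonlinear system, propagate it with a (near-)identity process model, and use a quantity such as the validation error or the predicted outputs on held-out data as the measurement used to update the state at each step.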



Acknowledgements

This work was supported by the Natural Science Foundation of China (61403229, 61503213) and Zhejiang Provincial Natural Science Foundation of China (LY13F030011, LQ17F030005).

Author information


Corresponding author

Correspondence to Hailun Wang.


About this article


Cite this article

Wang, H., Xu, D. & Martinez, A. Parameter selection method for support vector machine based on adaptive fusion of multiple kernel functions and its application in fault diagnosis. Neural Comput & Applic 32, 183–193 (2020). https://doi.org/10.1007/s00521-018-3792-7

