Quantum clustering-based weighted linear programming support vector regression for multivariable nonlinear problem

  • Original Paper
  • Published in: Soft Computing

Abstract

Linear programming support vector regression offers improved reliability and produces sparser solutions than standard support vector regression. We present a ν-linear programming support vector regression approach based on quantum clustering and a weighted strategy for solving multivariable nonlinear regression problems. First, the method applies quantum clustering to variable selection, introduces an inertia weight, and uses the prediction accuracy of ν-linear programming support vector regression as the evaluation criterion, which effectively removes redundant feature attributes and reduces both the prediction error and the number of support vectors. Second, because each data point influences the regression model differently, the method adopts a new weighted strategy and determines the weighting parameter p from the distribution of the training error, which greatly improves generalization and approximation ability. Experimental results show that, relative to the original ν-linear programming support vector regression method, the proposed algorithm reduces the mean squared error on the test sets of the Boston housing, Bodyfat, and Santa Fe datasets by 23.18, 78.52, and 41.39%, respectively, while sharply reducing the number of support vectors. Compared with other methods reported in the related literature, the proposed algorithm achieves better generalization performance.
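
To make the regression model concrete, the ν-linear programming support vector regression problem referred to above can be written as a linear program. The sketch below follows the standard ν-LP-SVR formulation, in which the kernel expansion is regularized by the 1-norm of its coefficients; the per-sample weights w_i are an assumption added here only to indicate where a weighting strategy of the kind described in the abstract would enter, and the authors' actual rule for choosing the weights via the parameter p is not reproduced.

Given training data $(x_i, y_i)$, $i = 1, \dots, \ell$, and the kernel expansion $f(x) = \sum_{j=1}^{\ell} (\alpha_j - \alpha_j^*)\, k(x_j, x) + b$:

\[
\begin{aligned}
\min_{\alpha,\,\alpha^*,\,\xi,\,\xi^*,\,\varepsilon,\,b}\quad
& \frac{1}{\ell}\sum_{i=1}^{\ell}\bigl(\alpha_i+\alpha_i^*\bigr)
  + C\left(\frac{1}{\ell}\sum_{i=1}^{\ell} w_i\bigl(\xi_i+\xi_i^*\bigr) + \nu\,\varepsilon\right) \\
\text{s.t.}\quad
& f(x_i) - y_i \le \varepsilon + \xi_i, \\
& y_i - f(x_i) \le \varepsilon + \xi_i^*, \\
& \alpha_i,\ \alpha_i^*,\ \xi_i,\ \xi_i^* \ge 0,\quad \varepsilon \ge 0,\quad i = 1,\dots,\ell.
\end{aligned}
\]

Setting every $w_i = 1$ recovers the unweighted ν-LP-SVR. Because the objective penalizes the 1-norm of the expansion coefficients, many $\alpha_i - \alpha_i^*$ are driven to zero at the optimum, which is the source of the sparse solutions (few support vectors) noted above; larger weights $w_i$ on selected points penalize their training errors more heavily.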



Acknowledgments

This work was supported by the National Science Foundation for Distinguished Young Scholars (No. 60625302), the National Natural Science Foundation of China (General Program, No. 60704028), the Program for Changjiang Scholars and Innovative Research Team in University (No. IRT0721), the 111 Project (No. B08021), and the Shanghai Leading Academic Discipline Project (No. B504).

Author information


Corresponding author

Correspondence to Feng Qian.


About this article

Cite this article

Yu, Y., Qian, F. & Liu, H. Quantum clustering-based weighted linear programming support vector regression for multivariable nonlinear problem. Soft Comput 14, 921–929 (2010). https://doi.org/10.1007/s00500-009-0478-1
