
Hierarchical Modified Regularized Least Squares Fuzzy Support Vector Regression through Multiscale Approach

  • Conference paper
Advances in Computational Intelligence (IWANN 2013)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7902)

Included in the following conference series: International Work-Conference on Artificial Neural Networks (IWANN)

Abstract

Support vector regression (SVR) is a promising regression tool based on the support vector machine (SVM). It identifies estimated models by minimizing Vapnik’s loss function over the residuals and represents the regression function as a linear combination of displaced replicas of a kernel function. A single kernel is ineffective when the function being approximated is nonstationary. This problem is addressed by hierarchical modified regularized least squares fuzzy support vector regression (HMRLFSVR), which is developed from modified regularized least squares fuzzy support vector regression (MRLFSVR) and regularized least squares fuzzy support vector regression (RLFSVR). HMRLFSVR consists of a set of hierarchical layers, each containing an MRLFSVR with a Gaussian kernel at a given scale. As the scale is refined layer by layer, finer details are incorporated into the regression function. The method adapts the local scale to the data while keeping the number of support vectors and the configuration time comparable with classical SVR, thereby overcoming the drawback of the single-kernel approach, which cannot follow variations in the frequency content across different regions of the input space. The approach interleaves the regression estimate with a pruning activity and denoises the original data, yielding an effective multiscale reconstruction. HMRLFSVR also simplifies the tuning of the SVR configuration parameters. On noisy synthetic and real datasets it compares favourably with multikernel approaches.
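The layered, coarse-to-fine scheme described in the abstract can be illustrated compactly. Below is a minimal sketch of the multiscale residual-fitting idea only, assuming a plain Gaussian-kernel regularized least squares (kernel ridge) fit as a stand-in for each MRLFSVR layer; the paper's fuzzy membership weighting and support vector pruning are omitted, and all names (`MultiscaleRLS`, `gaussian_kernel`, `n_layers`, `scale0`, `lam`) are illustrative rather than taken from the paper.

```python
# Sketch only: kernel ridge stands in for each MRLFSVR layer; the fuzzy
# weighting and pruning steps of the actual method are not reproduced here.
import numpy as np

def gaussian_kernel(X1, X2, scale):
    """Gram matrix of the Gaussian kernel at a given scale (bandwidth)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * scale ** 2))

class MultiscaleRLS:
    """Hierarchy of regularized least squares layers: each layer fits the
    residual left by the previous layers, with the kernel scale halved so
    that successive layers capture progressively finer details."""

    def __init__(self, n_layers=4, scale0=1.0, lam=1e-2):
        self.n_layers, self.scale0, self.lam = n_layers, scale0, lam

    def fit(self, X, y):
        self.X, self.alphas, self.scales = X, [], []
        residual = y.astype(float)
        scale = self.scale0
        for _ in range(self.n_layers):
            K = gaussian_kernel(X, X, scale)
            # Dual regularized least squares solution:
            #   alpha = (K + lam * I)^{-1} residual
            alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), residual)
            residual = residual - K @ alpha  # pass leftover detail onward
            self.alphas.append(alpha)
            self.scales.append(scale)
            scale /= 2.0                     # refine the scale, layer by layer
        return self

    def predict(self, Xq):
        # The regression estimate is the sum of all layer contributions.
        return sum(gaussian_kernel(Xq, self.X, s) @ a
                   for s, a in zip(self.scales, self.alphas))

# Usage on a nonstationary 1-D signal whose frequency grows across the input.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 200))[:, None]
y = np.sin(2.0 * np.pi * 6.0 * X[:, 0] ** 2) + 0.1 * rng.standard_normal(200)
y_hat = MultiscaleRLS(n_layers=5, scale0=0.5).fit(X, y).predict(X)
```

Because the target's frequency content varies over the input space, no single bandwidth fits it well: the coarse first layer captures the slow left-hand portion and the finer later layers pick up the fast oscillations on the right, which is exactly the failure mode of the single-kernel approach that the hierarchical method is designed to avoid.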





Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chaudhuri, A. (2013). Hierarchical Modified Regularized Least Squares Fuzzy Support Vector Regression through Multiscale Approach. In: Rojas, I., Joya, G., Cabestany, J. (eds) Advances in Computational Intelligence. IWANN 2013. Lecture Notes in Computer Science, vol 7902. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38679-4_39


  • DOI: https://doi.org/10.1007/978-3-642-38679-4_39

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-38678-7

  • Online ISBN: 978-3-642-38679-4
