Abstract
Support vector machines (SVMs) have recently been widely applied to regression estimation, but existing algorithms leave the choice of kernel type and kernel parameters to the user. This is a main cause of degraded regression performance, especially on complicated data such as nonlinear and non-stationary data. By introducing the empirical mode decomposition (EMD) method, with which any complicated data set can be decomposed into a finite and often small number of 'intrinsic mode functions' (IMFs) based on the local characteristic time scale of the data, this paper proposes an important extension of the SVM method: a multi-scale support vector machine based on EMD, in which several kernels of different scales are used simultaneously to approximate the target function at different scales. Experimental results demonstrate the effectiveness of the proposed method.
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Yang, Z., Guo, J., Xu, W., Nie, X., Wang, J., Lei, J. (2006). Multi-scale Support Vector Machine for Regression Estimation. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_151
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34439-1
Online ISBN: 978-3-540-34440-7