Image and Vision Computing

Volume 27, Issue 8, 2 July 2009, Pages 1040-1046

Auto-correlation wavelet support vector machine

https://doi.org/10.1016/j.imavis.2008.09.006

Abstract

A support vector machine (SVM) with the auto-correlation of a compactly supported wavelet as a kernel is proposed in this paper. We prove that this kernel is an admissible support vector kernel. The main advantage of the auto-correlation of a compactly supported wavelet is that it satisfies the translation invariance property, which is very important in signal processing. In addition, the auto-correlation kernel lets us select, from different wavelet families, a wavelet that suits the application at hand; as we demonstrate, the same wavelet filters should not be used regardless of the application. Experiments on signal regression and pattern recognition show that this kernel is a feasible kernel for practical applications.

Introduction

The support vector machine (SVM) was first developed by Vapnik for pattern recognition and function regression. It has been applied with great success in many application domains such as handwritten digit recognition, image classification, face detection, object detection, and text classification [1], [2], [3]. An SVM, as typically defined and applied, assumes that all samples in the training set are independent and identically distributed. It uses an approximate implementation of the structural risk minimization principle from statistical learning theory, rather than the empirical risk minimization method. The key operating principle is that a kernel maps the input data into a higher dimensional feature space so that the (classification) problem becomes linearly separable. The choice of kernel therefore plays a very important role in the performance of an SVM application. The most popular kernels include the Gaussian kernel, the polynomial kernel, the exponential radial basis function kernel, and the spline kernel.
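As a concrete illustration of how the kernel choice enters an SVM in practice, the following minimal sketch (not taken from the paper) trains classifiers with two of the standard kernels listed above using scikit-learn's SVC, which wraps LIBSVM; the synthetic dataset and hyperparameter values are purely illustrative.

```python
# Minimal sketch (illustrative, not from the paper): the same SVM trained with
# two of the standard kernels mentioned above.  scikit-learn's SVC wraps LIBSVM;
# the data and hyperparameters here are arbitrary choices.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for kernel, params in [("rbf", {"gamma": 0.1}), ("poly", {"degree": 3})]:
    clf = SVC(kernel=kernel, C=1.0, **params).fit(X_tr, y_tr)
    print(kernel, "test accuracy:", clf.score(X_te, y_te))
```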

Over the past decade, wavelet transforms have received substantial attention from researchers in numerous application areas. Both discrete and continuous wavelet transforms have shown great promise in such diverse fields as pattern recognition, image noise suppression, signal processing, image compression, and computer graphics, to name only a few. Chen and Xie [4] proposed two SVM kernels based on multiwavelet functions. Zhang et al. [5] proposed the scalar wavelet kernel for SVMs and found that it outperforms the Gaussian kernel for function regression and pattern recognition. The scalar wavelet they used is defined as $\psi(x)=\cos(1.75x)\,e^{-x^{2}/2}$. However, many other kinds of wavelets have good properties as well. Is the wavelet used in [5] always well suited to SVM applications? Probably not. Since a general wavelet function may not be an admissible kernel for an SVM, it is natural to use the auto-correlation of a wavelet to build SVM kernels. For the basics of wavelets and their applications, the reader is referred to [7], [8], [9], [10], [11], [12].
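For reference, [5] builds a translation-invariant kernel from this wavelet as a product over input dimensions, roughly $K(x,x')=\prod_i \psi((x_i-x'_i)/a)$ with a dilation parameter $a$. The short sketch below (an illustration with a single shared dilation, not code from either paper) computes the corresponding Gram matrix.

```python
# Sketch of the scalar wavelet kernel of [5]:
#   K(x, x') = prod_i psi((x_i - x'_i) / a),  psi(u) = cos(1.75 u) * exp(-u^2 / 2).
# A single dilation parameter `a` is assumed here for simplicity.
import numpy as np

def scalar_wavelet_kernel(X, Y, a=1.0):
    """Gram matrix of the translation-invariant scalar wavelet kernel."""
    D = (X[:, None, :] - Y[None, :, :]) / a          # pairwise differences
    return np.prod(np.cos(1.75 * D) * np.exp(-0.5 * D**2), axis=-1)
```

A Gram matrix like this can be supplied to any SVM implementation that accepts precomputed or callable kernels (for example, SVC(kernel="precomputed") in scikit-learn).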

In this paper, we propose to use the auto-correlation of compactly supported wavelets to construct SVM kernels and apply them to signal regression and pattern recognition. It is proven that this kernel is an admissible support vector kernel. Experimental results demonstrate that our auto-correlation wavelet kernel outperforms the scalar wavelet kernel, the Gaussian kernel, and the exponential radial basis function kernel for signal regression on several prototypical real-life signals. For pattern recognition, we conduct experiments on handwritten numeral recognition and obtain good recognition rates without exotic algorithmic embellishments; our results are better than the recognition rates reported by Chen et al. [7]. This indicates that the proposed auto-correlation wavelet kernel is a feasible kernel for practical applications. Note that this paper is an extension of our earlier conference paper [6].

The organization of this paper is as follows. Section 2 reviews the SVM for pattern recognition and function regression. Section 3 presents our auto-correlation wavelet SVM. Section 4 reports experiments on pattern recognition and function regression and compares the proposed kernel with the scalar wavelet kernel, the Gaussian kernel, and the exponential radial basis function kernel. Finally, Section 5 draws conclusions and discusses future work.

Review of SVM methods

We briefly review the SVM for function regression and pattern recognition in this section. As noted above, the SVM is a well-established technique with strong empirical performance across many domains.
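For reference, the decision functions covered by this review take the familiar kernel-expansion form (standard SVM theory, not a result specific to this paper):
\[
f_{\mathrm{class}}(x)=\operatorname{sgn}\!\Big(\sum_{i=1}^{n}\alpha_i y_i\,K(x_i,x)+b\Big),
\qquad
f_{\mathrm{reg}}(x)=\sum_{i=1}^{n}(\alpha_i-\alpha_i^{*})\,K(x_i,x)+b,
\]
where the $\alpha_i$ (and $\alpha_i^{*}$) are the Lagrange multipliers obtained from the dual optimization and $K$ is the chosen kernel.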

Auto-correlation of compactly supported wavelet as SVM kernel

The wavelet transform provides a time–frequency representation of a signal. The dyadic wavelet transform has attracted a lot of attention in the signal processing community. In a dyadic wavelet transform, any finite energy signal is represented in terms of dilates and translates of a single function called a wavelet. Wavelets satisfy a multiresolution analysis and obey the following two-scale relations:
\[
\phi(x)=\sqrt{2}\sum_{k=0}^{L-1}h_k\,\phi(2x-k)
\quad\text{and}\quad
\psi(x)=\sqrt{2}\sum_{k=0}^{L-1}g_k\,\phi(2x-k),
\]
where $g_k=(-1)^k h_{L-k-1}$, $k=0,\ldots,L-1$.
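One natural way to turn the auto-correlation of such a wavelet into a multivariate, translation-invariant kernel is the tensor product $K(x,y)=\prod_i \Phi(x_i-y_i)$, where $\Phi(u)=\int \psi(t)\psi(t-u)\,dt$. The sketch below is only a numerical illustration of this idea (the paper's construction is analytical), and it assumes the PyWavelets package to sample a compactly supported Daubechies wavelet on a grid.

```python
# Numerical sketch only (the paper's construction is analytical): approximate
# the auto-correlation  Phi(u) = integral psi(t) psi(t - u) dt  of a compactly
# supported Daubechies wavelet on a grid, then use the translation-invariant
# product kernel  K(x, y) = prod_i Phi(x_i - y_i).
# Assumption: the PyWavelets package (pywt) is available.
import numpy as np
import pywt

phi, psi, t = pywt.Wavelet("db3").wavefun(level=10)   # sampled scaling / wavelet functions
dt = t[1] - t[0]
acorr = np.correlate(psi, psi, mode="full") * dt      # discrete approximation of Phi
lags = (np.arange(acorr.size) - (psi.size - 1)) * dt  # lag axis for Phi

def autocorr_wavelet_kernel(X, Y):
    """Gram matrix K[i, j] = prod_d Phi(X[i, d] - Y[j, d])."""
    D = X[:, None, :] - Y[None, :, :]                       # pairwise differences
    vals = np.interp(D, lags, acorr, left=0.0, right=0.0)   # Phi vanishes off its compact support
    return np.prod(vals, axis=-1)
```

Since the Fourier transform of an auto-correlation is $|\hat{\psi}(\omega)|^{2}\geq 0$, Bochner's theorem guarantees that such a translation-invariant kernel is positive definite, in line with the admissibility result stated in the abstract.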


Applications of auto-correlation wavelet SVM

In this section, we apply the auto-correlation wavelet SVM to two important applications: function regression and pattern recognition.
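Purely to illustrate the mechanics (this is not the paper's dataset, features, or parameter settings), the following sketch plugs a custom translation-invariant kernel, such as the autocorr_wavelet_kernel sketched in the previous section, into scikit-learn's SVC as a callable kernel for a small digit-classification run.

```python
# Illustrative usage only (not the paper's experimental setup): an SVM with a
# user-supplied kernel callable, applied to scikit-learn's small digits set.
# `autocorr_wavelet_kernel` is assumed to be the function sketched above.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X = X / 16.0                                    # scale pixel values to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel=autocorr_wavelet_kernel, C=10.0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```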

Conclusions and future work

In this paper, we propose to use the auto-correlation of a compactly supported wavelet as a kernel for the SVM and apply it to signal regression and pattern recognition. The kernel is an admissible support vector kernel, and it can be used to approximate arbitrary functions in any dimension.

It is not surprising that the auto-correlation wavelet kernel is better than or comparable to other kinds of kernels. Experiments on signal regression and pattern recognition show that this kernel is a feasible kernel for practical applications.

Acknowledgements

The authors thank the anonymous reviewers whose constructive ideas and suggestions have improved the quality of the paper. This work was supported by a postdoctoral fellowship from the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canadian Space Agency Postdoctoral Fellowship Supplement. A modified version of the LIBSVM tool (a library for support vector machines), available at http://www.csie.ntu.edu.tw/cjlin/libsvm, was used for classification in this paper.

References (21)

  • G.Y. Chen et al., Contour-based handwritten numeral recognition using multiwavelets and neural networks, Pattern Recognition (2003).
  • A. Smola et al., The connection between regularization operators and support vector kernels, Neural Networks (1998).
  • V.N. Vapnik, The Nature of Statistical Learning Theory (1995).
  • V.N. Vapnik, Statistical Learning Theory (1998).
  • C. Cortes et al., Support-vector networks, Machine Learning (1995).
  • G.Y. Chen, W.F. Xie, Multiwavelet support vector machines, in: Proceedings of Image and Vision Computing, Dunedin, New...
  • L. Zhang et al., Wavelet support vector machine, IEEE Transactions on Systems, Man, and Cybernetics – Part B (2004).
  • G.Y. Chen, G. Dudek, Auto-correlation wavelet support vector machine and its applications to regression, in:...
  • C.K. Chui, An Introduction to Wavelets (1992).
  • I. Daubechies, Orthonormal bases of compactly supported wavelets, Communications on Pure and Applied Mathematics (1988).