Elsevier

Neurocomputing

Volume 215, 26 November 2016, Pages 212-216

Linear programming ν-nonparallel support vector machine and its application in vehicle recognition

https://doi.org/10.1016/j.neucom.2015.07.159

Abstract

In this paper, based on the nonparallel hyperplane classifier ν-nonparallel support vector machine (ν-NPSVM), we propose its linear programming formulation, termed ν-LPNPSVM. ν-NPSVM, which has been proved superior to the twin support vector machines (TWSVMs), is parameterized by the quantity ν, which lets one effectively control the number of support vectors. Compared with the quadratic programming problem of ν-NPSVM, a 1-norm regularization term is introduced into ν-LPNPSVM, making it a linear programming problem that can be solved quickly and easily. We also introduce kernel functions directly into the formulation for the nonlinear case. Numerical experiments on a wide range of data sets verify that our ν-LPNPSVM is superior to TWSVMs and faster than standard NPSVMs. We also apply the new method to a vehicle recognition problem and demonstrate its efficiency.

Introduction

Support vector machines (SVMs) [1], [2], [3], [4], [5] have been successfully applied in many fields [6], [7], [8], [9], [10], [11], [12]. Whether for classification or regression, SVMs represent the decision boundary in terms of a typically small subset of all training points, called the support vectors (SVs). For standard binary support vector classification (SVC), the basic idea is to find the optimal separating hyperplane between the positive and negative data points. The optimal hyperplane is obtained by maximizing the margin between two parallel hyperplanes, which is implemented by solving a corresponding quadratic programming problem (QPP). By introducing the kernel trick, SVC can also solve nonlinear classification problems successfully.
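As a hedged illustration of this idea (not code from the paper), a standard linear SVC in scikit-learn exposes the support-vector subset directly; the toy data set and parameters below are our own choices:

```python
from sklearn.svm import SVC
from sklearn.datasets import make_blobs

# Illustrative two-class toy data; parameters are our assumptions.
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear", C=1.0).fit(X, y)
# The separating hyperplane is determined by a small subset of the
# training points -- the support vectors.
n_sv = len(clf.support_)
```

On separable data such as this, `n_sv` is typically a small fraction of the 100 training points, which is the sparsity property the text refers to.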

Recently, nonparallel hyperplane SVMs, including the generalized eigenvalue proximal support vector machine (GEPSVM) [13] and the twin support vector machine (TWSVM) [14], have been developed and have attracted much interest. TWSVM seeks two nonparallel proximal hyperplanes such that each hyperplane is closer to one of the two classes and at a distance of at least one from the other. It is implemented by solving two smaller QPPs instead of a single larger one, which makes TWSVM training approximately four times faster than standard SVC. TWSVMs have been studied extensively [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28]. The nonparallel support vector machine (NPSVM) [26], [27], an improvement of TWSVM, is theoretically superior and overcomes several drawbacks of the existing TWSVMs. The ν version of NPSVM, ν-NPSVM [29], is parameterized by the quantity ν, which lets one effectively control the number of support vectors. By combining ν-support vector classification (ν-SVC) and ν-support vector regression (ν-SVR) to construct the primal problems, ν-NPSVM inherits the advantages of ν-SVM, enabling us to eliminate one of the other free parameters of NPSVM: the accuracy parameter ε or the regularization constant C.
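The role of ν can be sketched with scikit-learn's standard ν-SVC implementation (a related but distinct model from ν-NPSVM; the data and parameter values below are our own illustrative assumptions). In ν-SVC theory, ν is a lower bound on the fraction of support vectors and an upper bound on the fraction of margin errors:

```python
from sklearn.svm import NuSVC
from sklearn.datasets import make_blobs

# Toy data with some class overlap (all values here are illustrative).
X, y = make_blobs(n_samples=200, centers=2, cluster_std=3.0, random_state=0)

fractions = {}
for nu in (0.1, 0.3, 0.5):
    clf = NuSVC(nu=nu, kernel="linear").fit(X, y)
    # nu lower-bounds the fraction of training points that become SVs.
    fractions[nu] = len(clf.support_) / len(X)
```

Raising ν forces more points to become support vectors, which is the direct control over sparsity that the text attributes to ν-parameterized machines.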

In the above TWSVMs and NPSVMs, quadratic programming problems need to be solved. However, it is also possible to formulate classification problems as linear programs, replacing the quadratic objective function with a linear one. In this paper, we propose a linear programming ν-NPSVM. The algorithm can be considered an improved version of ν-NPSVM; the main difference is that it relies on solving linear programs instead of quadratic programs. Experimental results show that our algorithm is several times faster than ν-NPSVM while preserving classification accuracy.

The paper is organized as follows. Section 2 briefly reviews the original TWSVM, NPSVM, and its ν version, ν-NPSVM. Section 3 proposes the linear programming formulation of ν-NPSVM. Section 4 presents experimental results and a real application. Section 5 contains concluding remarks.

Section snippets

Background

In this section, we briefly introduce the basic TWSVM, the standard NPSVM and the ν-NPSVM.

Linear case

From ν-NPSVM, we know that the solutions of the primal problems with respect to $w_+$ and $w_-$ can be expressed as
$$w_+=\sum_{i=1}^{p+q}\alpha_i x_i,\qquad w_-=\sum_{i=1}^{p+q}\beta_i x_i,$$
where $\alpha$ and $\beta$ are the solutions of the dual problems. Applying the above Eq. (9), we take the 1-norm terms $\|\alpha\|_1$ and $\|\beta\|_1$ in place of the quadratic terms $\|w_+\|^2$ and $\|w_-\|^2$, respectively; furthermore, by introducing the constraints $-u_i\le\alpha_i\le u_i$ and $-s_i\le\beta_i\le s_i$, we finally obtain two linear programming problems (LPPs)
$$\min\ \frac{1}{2}\sum_{i=1}^{p+q}u_i+C_1\Big(\nu_1\varepsilon_{+}+\frac{1}{p}\sum_{i=1}^{p}(\eta_i+\eta_i^{*})\Big)+C_2\Big(\nu_2\rho_{+}+\frac{1}{q}\sum_{j=p+1}^{p+q}\xi_j\Big),$$
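To make the linearization device concrete, the same trick can be sketched on a plain single-hyperplane 1-norm soft-margin classifier: auxiliary variables $u$ bound $|w|$ exactly as $u_i$ and $s_i$ bound the dual coefficients above, so the whole problem becomes a linear program. This is an illustrative simplification using SciPy's `linprog`, not the paper's ν-LPNPSVM; the function name, toy data, and single-hyperplane form are our assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm_lp(X, y, C=1.0):
    """Train a 1-norm soft-margin linear classifier by linear programming.

    Variables are stacked as z = [w (d), u (d), b, xi (n)], with
    -u <= w <= u linearizing |w|, so min sum(u) + C*sum(xi) subject to
    y_i (w.x_i + b) >= 1 - xi_i is an LP.
    """
    n, d = X.shape
    # Objective: sum(u) + C * sum(xi); w and b carry zero cost.
    c = np.concatenate([np.zeros(d), np.ones(d), [0.0], C * np.ones(n)])
    # Margin constraints rewritten as -y_i x_i.w - y_i b - xi_i <= -1.
    A1 = np.hstack([-y[:, None] * X, np.zeros((n, d)), -y[:, None], -np.eye(n)])
    # Absolute-value linearization: w - u <= 0 and -w - u <= 0.
    A2 = np.hstack([np.eye(d), -np.eye(d), np.zeros((d, 1)), np.zeros((d, n))])
    A3 = np.hstack([-np.eye(d), -np.eye(d), np.zeros((d, 1)), np.zeros((d, n))])
    A_ub = np.vstack([A1, A2, A3])
    b_ub = np.concatenate([-np.ones(n), np.zeros(2 * d)])
    # w, u, b are free; the slacks xi are nonnegative.
    bounds = [(None, None)] * (2 * d + 1) + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:d], res.x[2 * d]

# Toy separable data for illustration.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = l1_svm_lp(X, y)
pred = np.sign(X @ w + b)
```

The LP structure (a linear objective over weighted sums of auxiliary bounding variables and slacks) mirrors the ν-LPNPSVM objective above, which is why such problems can be solved quickly by standard LP solvers.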

Numerical experiments and application

In this section, we perform the numerical experiments on the UCI datasets, and also apply this new method to the vehicle recognition problem.

Conclusion

In this paper, we proposed a linear programming formulation of ν-NPSVM, termed ν-LPNPSVM. Experiments and a real application to vehicle recognition have shown that our algorithm is several times faster than ν-NPSVM while maintaining the same level of accuracy; it is therefore suitable for large-scale data sets. Future research includes extensions to multi-class classification, regression and other…

Acknowledgments

This work was jointly supported by the Research Project for Outstanding Scholars of Beijing Municipal Commission of Transport (No. kj2013-2-34) and the Special Fund for Basic Scientific Research of Central Universities and Colleges (No. 2013JBM055).

Guangyu Zhu, born in 1972, is an Associate Professor at Beijing Jiaotong University. His research interests focus on intelligent transportation analysis.

References (29)

  • C. Cortes et al., Support-vector networks, Mach. Learn. (1995)
  • V.N. Vapnik, The Nature of Statistical Learning Theory (1996)
  • V.N. Vapnik, Statistical Learning Theory (1998)
  • C. Burges, A tutorial on support vector machines for pattern recognition, Data Min. Knowl. Discov. (1998)