Abstract.
Support Vector Machines (SVMs) have become a popular and powerful method for pattern classification problems. A key feature of SVMs is that they produce a separating hyperplane that maximizes the margin in a feature space induced by a nonlinear mapping via a kernel function. As a result, SVMs can handle not only linear but also nonlinear separation. Whereas the soft-margin method of SVMs considers only the distance between the separating hyperplane and misclassified data, in this paper we propose a multi-objective programming formulation that also takes surplus variables into account. A similar formulation was extensively researched in linear discriminant analysis, mostly in the 1980s, using Goal Programming (GP). This paper compares conventional methods such as SVMs and GP with our proposed formulation through several examples.
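To illustrate the kernel-based nonlinear separation the abstract refers to, the following is a minimal sketch (not code from the paper) of a soft-margin SVM with an RBF kernel on a toy problem that is not linearly separable; the dataset, kernel choice, and parameter values are illustrative assumptions using scikit-learn.

```python
# Minimal illustrative sketch (not the paper's method): a soft-margin SVM
# with an RBF kernel separates two concentric rings, a task no linear
# hyperplane in the original input space can solve.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two nonlinearly separable classes: an inner and an outer ring of points.
X, y = make_circles(n_samples=200, noise=0.1, factor=0.4, random_state=0)

# C is the soft-margin trade-off: larger C penalizes misclassified
# (margin-violating) points more heavily.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

The RBF kernel implicitly maps the inputs into a feature space where a margin-maximizing hyperplane corresponds to a nonlinear boundary in the original space.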
Received: September 2003. Revised: December 2003.
Cite this article
Asada, T., Yun, Y., Nakayama, H. et al. Pattern classification by goal programming and support vector machines. Computational Management Science 1, 211–230 (2004). https://doi.org/10.1007/s10287-004-0013-x