Abstract
This paper presents a Multi-Classification Schema (MCS) that combines Weighted SVMs (WSVM) and Spectrum-based kNN (SkNN). Each basic SVM is equipped with belief coefficients that reflect its ability to identify individual classes, and each is built in its own feature space so that it adapts to diverse training data contexts. Samples rejected by all basic classifiers are handled by SkNN, which couples a weighted voting strategy with a locally informative metric derived from the most discriminant directions carried by the spectral information of the data. Two strategies keep the computational cost of MCS moderate: reduction of the training dataset and pre-specification of the SkNN working set. Experiments on real datasets show that MCS improves classification accuracy at moderate cost compared with the state of the art.
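The following is a minimal sketch of the decision flow suggested by the abstract, assuming scikit-learn. The class name MCSSketch, the belief-coefficient dictionary, and the simple one-vs-rest construction are illustrative assumptions, not the authors' exact WSVM/SkNN formulation; in particular, the spectrum-based local metric is replaced here by plain Euclidean kNN, and a single shared feature space stands in for the per-classifier feature spaces.

```python
# Hypothetical sketch of the MCS decision flow: per-class SVMs weighted by
# belief coefficients, with a kNN fallback for samples rejected by all SVMs.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

class MCSSketch:
    def __init__(self, classes, belief=None, n_neighbors=5):
        self.classes = list(classes)
        # Belief coefficients (assumed here to be given per-class weights on
        # the SVM decision values; the paper learns them from the data).
        self.belief = belief if belief is not None else {c: 1.0 for c in self.classes}
        self.svms = {}
        # Plain kNN stands in for SkNN; the paper additionally uses a local
        # metric from discriminant directions and a pre-specified working set.
        self.knn = KNeighborsClassifier(n_neighbors=n_neighbors)

    def fit(self, X, y):
        # One binary (one-vs-rest) SVM per class.
        for c in self.classes:
            clf = SVC(kernel="rbf")
            clf.fit(X, (y == c).astype(int))
            self.svms[c] = clf
        # kNN working set: the full training set in this simplified sketch.
        self.knn.fit(X, y)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X):
            x = x.reshape(1, -1)
            # Belief-weighted decision values from all basic SVMs.
            scores = {c: self.belief[c] * self.svms[c].decision_function(x)[0]
                      for c in self.classes}
            best_c, best_s = max(scores.items(), key=lambda kv: kv[1])
            if best_s > 0:
                preds.append(best_c)        # accepted by a basic classifier
            else:
                preds.append(self.knn.predict(x)[0])  # rejected by all: kNN fallback
        return np.array(preds)

# Example usage (with arrays X_train, y_train, X_test):
# model = MCSSketch(classes=np.unique(y_train)).fit(X_train, y_train)
# y_pred = model.predict(X_test)
```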
Copyright information
© 2007 Springer Berlin Heidelberg
About this paper
Cite this paper
Ping, L., Nan, L., Jian-yu, W., Chun-Guang, Z. (2007). Combining Weighted SVMs and Spectrum-Based kNN for Multi-classification. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72395-0_57
DOI: https://doi.org/10.1007/978-3-540-72395-0_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72394-3
Online ISBN: 978-3-540-72395-0