Rough Margin Based Core Vector Machine

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2010)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6118)

Abstract

The recently proposed rough margin based support vector machine (RMSVM) can effectively tackle the overfitting problem caused by outliers with the help of rough margins. However, standard solvers for the RMSVM are time consuming and not feasible for large datasets. On the other hand, the core vector machine (CVM) is an optimization technique based on the minimum enclosing ball that can scale an SVM up to very large datasets. Although the 2-norm error used in the CVM may make it theoretically less robust against outliers, the rough margin can make up for this deficiency. We therefore propose rough margin based core vector machine algorithms. Experimental results show that our algorithms retain generalization performance almost as good as that of the RMSVM on large-scale datasets and significantly improve the accuracy of the CVM on extremely noisy datasets, while consuming far fewer computational resources and often running faster than the CVM.
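As noted above, the CVM reformulates SVM training as computing a (1 + ε)-approximate minimum enclosing ball (MEB) over a small core set of the training points. The minimal sketch below illustrates that geometric primitive with a simple Bădoiu–Clarkson style core-set iteration in plain Euclidean space; the function name approx_meb, the NumPy setting, and the use of raw input points instead of the kernel-induced feature space are illustrative assumptions, not the authors' implementation.

import numpy as np

def approx_meb(points, eps=0.1):
    """Return an approximate MEB center, its radius, and the core-set indices."""
    points = np.asarray(points, dtype=float)
    center = points[0].copy()                 # start the ball at an arbitrary point
    core = [0]
    n_iter = int(np.ceil(1.0 / eps ** 2))     # O(1/eps^2) iterations give a (1+eps)-approximation
    for t in range(1, n_iter + 1):
        dists = np.linalg.norm(points - center, axis=1)
        far = int(np.argmax(dists))           # farthest point from the current center
        core.append(far)
        # shrinking step toward the farthest point (Badoiu-Clarkson update)
        center += (points[far] - center) / (t + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius, sorted(set(core))

# Usage: enclose 1000 random points in the plane with eps = 0.05.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
c, r, core = approx_meb(X, eps=0.05)
print(r, len(core))

In the CVM itself this iteration is carried out in the kernel-induced feature space, so each step only needs kernel evaluations against the current core set, which is what keeps the training cost modest even on very large datasets.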

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Niu, G., Dai, B., Shang, L., Ji, Y. (2010). Rough Margin Based Core Vector Machine. In: Zaki, M.J., Yu, J.X., Ravindran, B., Pudi, V. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2010. Lecture Notes in Computer Science, vol 6118. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13657-3_16

  • DOI: https://doi.org/10.1007/978-3-642-13657-3_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13656-6

  • Online ISBN: 978-3-642-13657-3

  • eBook Packages: Computer Science, Computer Science (R0)
