Factorizing Class Characteristics via Group MEBs Construction

  • Conference paper
Neural Information Processing. Models and Applications (ICONIP 2010)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6444)


Abstract

The classic MEB (minimum enclosing ball) models the characteristics of each class for classification by extracting core vectors through solving a (1 + ε)-approximation problem. In this paper, we develop a new MEB system that learns the core vector set in a group manner, called group MEB (g-MEB). The g-MEB factorizes class characteristics in three respects: it reduces sparseness in the MEB by decomposing the data space according to the data distribution density, it discriminates core vectors on class interaction hyperplanes, and it enables outlier detection to reduce the effect of noise. Experimental results show that the factorized core set from the g-MEB often delivers noticeably higher classification accuracies than the classic MEB.
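The abstract refers to the classic MEB core-set construction obtained by solving a (1 + ε)-approximation problem. As a point of reference only, the sketch below illustrates the standard Bădoiu–Clarkson style iteration commonly used for this baseline (repeatedly pulling the center toward the farthest point); it is not the g-MEB method proposed in the paper, and the function name `approx_meb`, the ε value, and the test data are illustrative assumptions.

```python
import numpy as np

def approx_meb(points, eps=0.1):
    """Sketch of a (1 + eps)-approximate minimum enclosing ball via the
    simple core-set iteration: at each step the point farthest from the
    current center joins the core set and the center moves toward it.
    Returns (center, radius, core_indices)."""
    points = np.asarray(points, dtype=float)
    n_iter = int(np.ceil(1.0 / eps ** 2))   # O(1/eps^2) iterations suffice
    center = points[0].copy()               # start from an arbitrary point
    core_idx = {0}
    for t in range(1, n_iter + 1):
        # farthest point from the current center becomes a core vector
        dists = np.linalg.norm(points - center, axis=1)
        far = int(np.argmax(dists))
        core_idx.add(far)
        # shrinking step size 1/(t+1) toward the farthest point
        center += (points[far] - center) / (t + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius, sorted(core_idx)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))           # hypothetical sample data
    c, r, core = approx_meb(X, eps=0.1)
    print(f"radius={r:.3f}, core set size={len(core)}")
```

The returned core set plays the role of the "core vectors" mentioned in the abstract: a small subset of points whose enclosing ball approximates that of the full class.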




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, Y., Pang, S., Kasabov, N. (2010). Factorizing Class Characteristics via Group MEBs Construction. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Models and Applications. ICONIP 2010. Lecture Notes in Computer Science, vol 6444. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17534-3_35

  • DOI: https://doi.org/10.1007/978-3-642-17534-3_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-17533-6

  • Online ISBN: 978-3-642-17534-3

  • eBook Packages: Computer Science, Computer Science (R0)
