
STOCS: An Efficient Self-Tuning Multiclass Classification Approach

  • Conference paper

Advances in Artificial Intelligence (Canadian AI 2015)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9091)


Abstract

We propose a simple, efficient, and parameter-free approach to multiclass classification that is especially useful for large-scale datasets in the presence of label noise. Grown out of the one-class SVM, our approach has several distinct features: first, its decision boundary is learned from both positive and negative examples; second, the internal parameters, and in particular the kernel bandwidth, are self-tuned. We compare our approach side-by-side with LIBSVM, arguably the most widely used multiclass classification system, in a sequence of empirical evaluations, where our approach performs almost as well as LIBSVM with parameter settings tuned optimally for each individual dataset, while consuming only a fraction of the processing time.
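The abstract only sketches the self-tuning idea, and the paper's full STOCS procedure is not reproduced here. One widely used device consistent with "the kernel bandwidth is self-tuned" is the median-distance heuristic, which sets an RBF bandwidth from pairwise distances with no user-chosen parameter. The toy per-class scorer below is an illustrative sketch under that assumption (all names are hypothetical; unlike STOCS, it scores each class from its own examples only rather than also exploiting negatives):

```python
import numpy as np

def median_bandwidth(X, max_pairs=2000, seed=0):
    """Median-distance heuristic: estimate an RBF bandwidth as the
    median of pairwise distances over a random sample of pairs."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(X), max_pairs)
    j = rng.integers(0, len(X), max_pairs)
    d = np.linalg.norm(X[i] - X[j], axis=1)
    return np.median(d[d > 0])  # drop zero-distance (i == j) pairs

class KernelClassScorer:
    """Toy multiclass classifier: one RBF similarity score per class
    (in the spirit of per-class one-class models); predict the argmax."""

    def fit(self, X, y):
        self.sigma = median_bandwidth(X)      # self-tuned, no user input
        self.classes_ = np.unique(y)
        self.X_by_class = [X[y == c] for c in self.classes_]
        return self

    def predict(self, X):
        scores = []
        for Xc in self.X_by_class:
            # mean RBF similarity of each query point to this class
            d2 = ((X[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
            scores.append(np.exp(-d2 / (2 * self.sigma ** 2)).mean(1))
        return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]
```

On two well-separated Gaussian blobs, fitting and predicting requires no manual bandwidth selection, which is the property the abstract emphasizes; the actual STOCS decision boundary and its use of negative examples differ from this sketch.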




Author information

Corresponding author

Correspondence to Minglun Gong.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Qian, Y., Gong, M., Cheng, L. (2015). STOCS: An Efficient Self-Tuning Multiclass Classification Approach. In: Barbosa, D., Milios, E. (eds) Advances in Artificial Intelligence. Canadian AI 2015. Lecture Notes in Computer Science, vol 9091. Springer, Cham. https://doi.org/10.1007/978-3-319-18356-5_26


  • DOI: https://doi.org/10.1007/978-3-319-18356-5_26


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-18355-8

  • Online ISBN: 978-3-319-18356-5

  • eBook Packages: Computer Science (R0)
