A Max-Margin Learning Algorithm with Additional Features

  • Conference paper
Frontiers in Algorithmics (FAW 2009)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5598)


Abstract

This paper investigates the problem of learning classifiers from samples that carry additional features, some of which are absent due to noise or measurement corruption. The common approach to handling missing features in discriminative models is to first impute the unknown values and then run a standard classification procedure on the completed data. In this paper, an incremental max-margin learning algorithm is proposed for data with additional features, some of which are missing. We show how a max-margin learning framework can classify the incomplete data directly, without any imputation of the missing features. Based on the geometric interpretation of the margin, we formulate an objective function that maximizes the margin of each sample in its own relevant subspace. In this formulation, we reuse the structural parameters already trained on the existing features and optimize only the structural parameters associated with the additional features. A two-step iterative procedure for solving the objective function is proposed. By avoiding the pre-processing phase in which the data are completed, our algorithm offers considerable computational savings; by reusing the structural parameters trained on the existing features and training only those of the additional features, it also saves substantial training time. Experiments on a large number of standard UCI benchmarks show that our algorithm achieves better or comparable classification accuracy.
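The abstract describes the method only at a high level. The following minimal sketch (in Python, not the authors' implementation) illustrates the core idea under simplifying assumptions: weights already trained on the existing features are kept fixed, additional features may be absent (marked NaN), and each sample's hinge-loss margin is computed only over its observed coordinates, so no imputation is performed. The per-sample normalization used in the paper's geometric-margin formulation is omitted for brevity, and all names (fit_additional_weights, lam, lr) are illustrative.

```python
# Minimal sketch, not the authors' code: max-margin training when the
# "additional" feature block may have missing entries.  Existing-feature
# weights w_old are treated as fixed structural parameters; only the
# additional-feature weights w_new are trained.
import numpy as np

def fit_additional_weights(X_old, X_new, y, w_old, lam=0.1, lr=0.01, epochs=200):
    """Hinge-loss subgradient descent over the additional-feature weights only.

    X_old : (n, d_old) fully observed existing features
    X_new : (n, d_new) additional features, NaN where absent
    y     : (n,) labels in {-1, +1}
    w_old : (d_old,) structural parameters already trained on existing features
    """
    n, d_new = X_new.shape
    w_new = np.zeros(d_new)
    observed = ~np.isnan(X_new)                      # each sample's relevant subspace
    X_new_filled = np.where(observed, X_new, 0.0)    # absent coordinates contribute nothing

    base = X_old @ w_old                             # fixed contribution of existing features
    for _ in range(epochs):
        # Margin of each sample, computed only over its observed coordinates.
        margins = y * (base + X_new_filled @ w_new)
        violated = margins < 1.0                     # samples with active hinge loss
        # Subgradient of (lam/2)*||w_new||^2 + mean hinge loss.
        grad = lam * w_new - (y[violated, None] * X_new_filled[violated]).sum(axis=0) / n
        w_new -= lr * grad
    return w_new

def predict(X_old, X_new, w_old, w_new):
    X_new_filled = np.where(np.isnan(X_new), 0.0, X_new)
    return np.sign(X_old @ w_old + X_new_filled @ w_new)
```

A hypothetical usage would train w_old on the existing features with any standard linear max-margin solver, and then call fit_additional_weights once the additional, possibly incomplete, features become available, so that only the new parameters are optimized.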





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liu, X., Yin, J., Zhu, E., Zhan, Y., Li, M., Zhang, C. (2009). A Max-Margin Learning Algorithm with Additional Features. In: Deng, X., Hopcroft, J.E., Xue, J. (eds) Frontiers in Algorithmics. FAW 2009. Lecture Notes in Computer Science, vol 5598. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02270-8_21


  • DOI: https://doi.org/10.1007/978-3-642-02270-8_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02269-2

  • Online ISBN: 978-3-642-02270-8

