
Non-monotonic Feature Selection for Regression

  • Conference paper
Neural Information Processing (ICONIP 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8835)


Abstract

Feature selection is an important research problem in machine learning and data mining. In practical applications it is usually constrained by a budget on the size of the selected feature subset. When the budget changes, the ranks of features within the selected subsets may also change, because the cost of acquiring features is a nonlinear function of the chosen subset; this property is called non-monotonic feature selection. In this paper, we focus on non-monotonic feature selection for regression tasks. We approximate the original combinatorial optimization problem by a Multiple Kernel Learning (MKL) problem and prove a performance guarantee for the derived solution relative to the globally optimal solution of the combinatorial problem. Detailed experiments demonstrate the effectiveness of the proposed method, with empirical results showing promising performance compared with several state-of-the-art feature selection approaches.
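
In outline, the approach relaxes the budgeted combinatorial search over feature subsets into an MKL problem with one kernel per feature, and re-solves it for each budget so that the selected subsets need not be nested. The sketch below is a minimal, hypothetical Python illustration of that idea, assuming linear per-feature kernels and a simple alternating scheme built on kernel ridge regression; it is not the authors' actual solver, and the function name budgeted_mkl_select is invented for this example.

    import numpy as np

    def budgeted_mkl_select(X, y, budget, lam=1.0, n_iter=20):
        """Toy budgeted feature selection via a multiple-kernel relaxation.

        One linear kernel per feature, K_j = x_j x_j^T. Alternates between
        kernel ridge regression on the combined kernel K(mu) = sum_j mu_j K_j
        and a hard projection of the kernel weights mu onto the top-`budget`
        features. Hypothetical sketch only, not the paper's solver.
        """
        n, d = X.shape
        mu = np.full(d, 1.0 / d)                 # kernel weights, one per feature
        keep = np.arange(d)
        for _ in range(n_iter):
            K = (X * mu) @ X.T                   # K(mu) = sum_j mu_j x_j x_j^T
            alpha = np.linalg.solve(K + lam * np.eye(n), y)  # ridge dual solution
            score = (X.T @ alpha) ** 2           # alpha^T K_j alpha = (alpha . x_j)^2
            keep = np.argsort(score)[::-1][:budget]          # enforce the budget
            mu = np.zeros(d)
            mu[keep] = score[keep] / score[keep].sum()
        return sorted(keep.tolist())

    # Two noisy copies of the same latent signal make non-nested subsets plausible.
    rng = np.random.default_rng(0)
    z = rng.standard_normal(100)
    X = np.column_stack([z + 0.1 * rng.standard_normal(100),
                         z + 0.1 * rng.standard_normal(100),
                         rng.standard_normal((100, 4))])
    y = z + X[:, 2] + 0.5 * X[:, 3] + 0.1 * rng.standard_normal(100)
    for B in (2, 3, 4):
        print(B, budgeted_mkl_select(X, y, budget=B))

Because each budget triggers its own solve, the subset chosen at budget 3 is free to drop a feature chosen at budget 2; with redundant features such as the two copies of z above, the reported subsets need not be nested, which is the non-monotonic behaviour the paper formalizes.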





Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Yang, H., Xu, Z., King, I., Lyu, M.R. (2014). Non-monotonic Feature Selection for Regression. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_6


  • DOI: https://doi.org/10.1007/978-3-319-12640-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12639-5

  • Online ISBN: 978-3-319-12640-1

  • eBook Packages: Computer Science, Computer Science (R0)
