Sparsity Preserving Score for Joint Feature Selection

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 8261)

Abstract

Based on recent advances in sparse representation techniques, we propose the Sparsity Preserving Score (SPS) for jointly selecting features. SPS evaluates the importance of a feature by its power to preserve sparse reconstructive relationships, which is achieved by minimizing an objective function with ℓ1-norm regularization and binary constraints. Our search strategy, essentially a discrete optimization, jointly selects features by projecting the original high-dimensional data into a low-dimensional space through a special binary projection matrix. Theoretical analysis guarantees that the objective function admits a closed-form solution, which is as simple as scoring each feature by the Frobenius norm of its sparse linear reconstruction residual. Comparative experiments on two face datasets demonstrate the effectiveness and efficiency of our algorithm.
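
The abstract compresses the method into two steps: learn sparse reconstruction coefficients with an ℓ1-regularized fit, then score each feature by the norm of its reconstruction residual. The sketch below illustrates that reading only; it is not the authors' implementation. The function name, the alpha value, the use of scikit-learn's Lasso as the ℓ1 solver, and the smallest-residual selection rule are all assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso


def sparsity_preserving_score(X, alpha=0.01):
    """Sketch of the SPS idea from the abstract (not the authors' exact code).

    X     : (n_samples, n_features) data matrix.
    alpha : l1 regularization weight (hypothetical default).
    Returns one score per feature: the norm of that feature's sparse
    linear reconstruction residual. Smaller = better preserving.
    """
    n, d = X.shape
    S = np.zeros((n, n))
    # Step 1: for each sample x_i, find sparse weights over the other
    # samples: min ||x_i - X_{-i}^T s||_2^2 + alpha * ||s||_1
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[others].T, X[i])  # design-matrix columns = other samples
        S[i, others] = lasso.coef_
    # Step 2: per-feature residual of the sparse reconstruction; the
    # closed form described in the abstract scores feature j by the
    # (Frobenius) norm of the residual restricted to that feature.
    R = X - S @ X                      # (n, d) residual matrix
    return np.linalg.norm(R, axis=0)   # one score per feature


# Hypothetical usage: keep the k features whose residuals are smallest,
# i.e., those that best preserve the sparse reconstructive relationship.
# X = np.random.randn(100, 50); k = 10
# selected = np.argsort(sparsity_preserving_score(X))[:k]
```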

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yan, H. (2013). Sparsity Preserving Score for Joint Feature Selection. In: Sun, C., Fang, F., Zhou, ZH., Yang, W., Liu, ZY. (eds) Intelligence Science and Big Data Engineering. IScIDE 2013. Lecture Notes in Computer Science, vol 8261. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42057-3_80

  • DOI: https://doi.org/10.1007/978-3-642-42057-3_80

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-42056-6

  • Online ISBN: 978-3-642-42057-3

  • eBook Packages: Computer Science, Computer Science (R0)
