Fast Sparse Multinomial Regression Applied to Hyperspectral Data

  • Conference paper
Image Analysis and Recognition (ICIAR 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4142)

Abstract

Methods for learning sparse classifiers are among the state of the art in supervised learning. Sparsity, essential for good generalization, can be enforced by placing heavy-tailed priors/regularizers on the weights of the linear combination of functions; these priors/regularizers favour a few large weights while driving many others to exactly zero. The Sparse Multinomial Logistic Regression (SMLR) algorithm [1] is one such method, adopting a Laplacian prior to enforce sparseness. Applying it to large datasets remains computationally delicate, and is sometimes infeasible. This work implements an iterative procedure to calculate the weights of the decision function that is O(m²) faster than the original method introduced in [1], where m is the number of classes. The benchmark dataset Indian Pines is used to test this modification. Results over subsets of this dataset are presented and compared with results obtained with support vector machines.
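To make the model family concrete, the sketch below fits the same kind of model SMLR fits: multinomial logistic regression with an L1 penalty, which plays the role of the Laplacian prior and drives many weights to exactly zero. This is an illustrative analogue using scikit-learn, not the paper's fast bound-optimization algorithm; the synthetic dataset, class count, and regularization strength `C=0.1` are placeholders standing in for hyperspectral pixel data, not values from the paper.

```python
# Illustrative sketch (not the paper's implementation): L1-penalized
# multinomial logistic regression, the model family SMLR belongs to.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a hyperspectral pixel dataset:
# 200 "bands" (features), 4 land-cover classes.
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=20, n_classes=4,
                           n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# penalty='l1' corresponds to the Laplacian prior on the weights:
# it sets many coefficients exactly to zero, yielding a sparse classifier.
clf = LogisticRegression(penalty='l1', solver='saga', C=0.1,
                         max_iter=5000).fit(X_tr, y_tr)

sparsity = np.mean(clf.coef_ == 0.0)
print(f"fraction of exactly-zero weights: {sparsity:.2f}")
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

The weight matrix has one row per class, which is where the m (number of classes) factor in the paper's cost analysis comes from: each iteration of the original SMLR update couples all m sets of weights.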

References

  1. Krishnapuram, B., Carin, L., Figueiredo, M.A.T., Hartemink, A.J.: Sparse Multinomial Logistic Regression: Fast Algorithms and Generalization Bounds. IEEE Transactions on Pattern Analysis and Machine Intelligence 27(6), 957–968 (2005)

  2. Landgrebe, D.A.: Signal Theory Methods in Multispectral Remote Sensing. John Wiley and Sons, Inc., Hoboken, New Jersey (2003)

  3. Vapnik, V.: Statistical Learning Theory. John Wiley, New York (1998)

  4. Camps-Valls, G., Bruzzone, L.: Kernel-based methods for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing 43(6), 1351–1362 (2005)

  5. Tipping, M.: Sparse Bayesian learning and the relevance vector machine. Journal of Machine Learning Research 1, 211–244 (2001)

  6. Figueiredo, M.: Adaptive Sparseness for Supervised Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence 25(9), 1150–1159 (2003)

  7. Csato, L., Opper, M.: Sparse online Gaussian processes. Neural Computation 14(3), 641–668 (2002)

  8. Lawrence, N.D., Seeger, M., Herbrich, R.: Fast sparse Gaussian process methods: The informative vector machine. In: Becker, S., Thrun, S., Obermayer, K. (eds.) Advances in Neural Information Processing Systems 15, pp. 609–616. MIT Press, Cambridge (2003)

  9. Krishnapuram, B., Carin, L., Hartemink, A.J.: Joint classifier and feature optimization for cancer diagnosis using gene expression data. In: Proceedings of the International Conference in Research in Computational Molecular Biology (RECOMB 2003), Berlin, Germany (2003)

  10. Krishnapuram, B., Carin, L., Hartemink, A.J., Figueiredo, M.A.T.: A Bayesian approach to joint feature selection and classifier design. IEEE Transactions on Pattern Analysis and Machine Intelligence 26, 1105–1111 (2004)

  11. Quarteroni, A., Sacco, R., Saleri, F.: Numerical Mathematics. TAM Series, vol. 37. Springer, New York (2000)

  12. Bioucas-Dias, J.M.: Fast Sparse Multinomial Logistic Regression. Technical Report, Instituto Superior Técnico (2006). Available at: http://www.lx.it.pt/~bioucas/

  13. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, New York (2001)

  14. Lange, K., Hunter, D., Yang, I.: Optimizing transfer using surrogate objective functions. Journal of Computational and Graphical Statistics 9, 1–59 (2000)

  15. Landgrebe, D.A.: NW Indiana’s Indian Pine (1992), Available at: http://dynamo.ecn.purdue.edu/~biehl/MultiSpec/

  16. The MathWorks: MATLAB, The Language of Technical Computing. Using MATLAB: Version 6. The MathWorks, Inc. (2000)

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Borges, J.S., Bioucas-Dias, J.M., Marçal, A.R.S. (2006). Fast Sparse Multinomial Regression Applied to Hyperspectral Data. In: Campilho, A., Kamel, M. (eds) Image Analysis and Recognition. ICIAR 2006. Lecture Notes in Computer Science, vol 4142. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11867661_63

  • DOI: https://doi.org/10.1007/11867661_63

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44894-5

  • Online ISBN: 978-3-540-44896-9
