
Feature Selection Based on a New Formulation of the Minimal-Redundancy-Maximal-Relevance Criterion

  • Conference paper
Pattern Recognition and Image Analysis (IbPRIA 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4477)


Abstract

This paper proposes an incremental method for feature selection, aimed at identifying attributes in a dataset that allow good classifiers to be built at low computational cost. The approach builds on the minimal-redundancy-maximal-relevance (mRMR) framework, which selects features relevant to a given classification task while avoiding redundancy among them. Relevance and redundancy are commonly defined in terms of information-theoretic concepts. This paper proposes a modification of the mRMR framework based on a more appropriate quantification of the redundancy among features. Experiments on discrete-valued datasets show that classifiers built from features selected by the proposed method are more accurate than those obtained using the original mRMR features.
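To make the framework concrete, the following is a minimal sketch of the standard incremental mRMR selection scheme (the mutual-information-difference form) on discrete data. It is not the paper's modified redundancy quantification, only the baseline criterion the paper starts from; the function names `mutual_information` and `mrmr_select` are our own illustrative choices.

```python
# Sketch of incremental mRMR feature selection on discrete-valued data.
# Relevance I(feature; labels) and redundancy I(feature_i; feature_j) are
# both measured with empirical mutual information; at each step the
# feature maximizing (relevance - mean redundancy to selected set) is added.
from collections import Counter
import math

def mutual_information(x, y):
    """Empirical mutual information I(X;Y) for two discrete sequences."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), with counts c, px[a], py[b]
        mi += (c / n) * math.log2(c * n / (px[a] * py[b]))
    return mi

def mrmr_select(features, labels, k):
    """Greedily pick k feature indices, maximizing relevance minus
    mean redundancy with respect to the already-selected features."""
    relevance = [mutual_information(f, labels) for f in features]
    # Seed with the single most relevant feature.
    selected = [max(range(len(features)), key=lambda i: relevance[i])]
    while len(selected) < k:
        best, best_score = None, -float("inf")
        for i in range(len(features)):
            if i in selected:
                continue
            redundancy = sum(mutual_information(features[i], features[j])
                             for j in selected) / len(selected)
            score = relevance[i] - redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
    return selected
```

For example, with features stored as rows (`X = [[0,0,1,1], [1,0,1,0]]`) and labels `y = [0,0,1,1]`, `mrmr_select(X, y, 1)` picks the feature with maximal mutual information with the labels. The paper's contribution is to replace the redundancy term above with a better-founded quantification.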



Editor information

Joan Martí, José Miguel Benedí, Ana Maria Mendonça, Joan Serrat


Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Ponsa, D., López, A. (2007). Feature Selection Based on a New Formulation of the Minimal-Redundancy-Maximal-Relevance Criterion. In: Martí, J., Benedí, J.M., Mendonça, A.M., Serrat, J. (eds) Pattern Recognition and Image Analysis. IbPRIA 2007. Lecture Notes in Computer Science, vol 4477. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72847-4_8


  • DOI: https://doi.org/10.1007/978-3-540-72847-4_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72846-7

  • Online ISBN: 978-3-540-72847-4

  • eBook Packages: Computer Science; Computer Science (R0)
