
Anytime Algorithm for Feature Selection

  • Conference paper
  • First Online:
Rough Sets and Current Trends in Computing (RSCTC 2000)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2005)


Abstract

Feature selection is used to improve the performance of learning algorithms by finding a minimal subset of relevant features. Since feature selection is computationally intensive, a trade-off between the quality of the selected subset and the computation time is required. In this paper, we present a novel anytime algorithm for feature selection, which gradually improves the quality of its results as the computation time increases. The algorithm is interruptible, i.e., it can be stopped at any time and will provide a partial subset of selected features. The quality of results is monitored by a new measure, fuzzy information gain. The performance of the algorithm is evaluated on several benchmark datasets.

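The abstract describes the general anytime pattern: an interruptible loop that always holds a usable partial result and improves it as more computation time is granted. The sketch below illustrates that pattern with greedy forward selection scored by ordinary (crisp) conditional information gain; it is not the authors' algorithm and does not implement the paper's fuzzy information gain measure, and all names and the time-budget interface are illustrative assumptions.

```python
import math
import time
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(X, y, features):
    """H(y | features): average class entropy within each partition induced
    by the joint values of the given (discrete) features."""
    if not features:
        return entropy(y)
    partitions = {}
    for row, label in zip(X, y):
        partitions.setdefault(tuple(row[f] for f in features), []).append(label)
    n = len(y)
    return sum(len(p) / n * entropy(p) for p in partitions.values())

def anytime_feature_selection(X, y, n_features, budget_seconds):
    """Interruptible greedy forward selection (illustrative sketch only).

    At every point in time `selected` is a usable partial result, and its
    quality (the reduction in class entropy) improves monotonically as more
    computation time is granted -- the defining property of an anytime
    algorithm.  The score here is plain conditional information gain,
    standing in for the paper's fuzzy information gain.
    """
    selected = []
    remaining = list(range(n_features))
    deadline = time.monotonic() + budget_seconds
    current_h = conditional_entropy(X, y, selected)
    while remaining and time.monotonic() < deadline:
        # Score each remaining feature by how much it reduces the class
        # entropy given the features already selected.
        gains = {f: current_h - conditional_entropy(X, y, selected + [f])
                 for f in remaining}
        best = max(gains, key=gains.get)
        if gains[best] <= 1e-12:        # no measurable improvement left
            break
        selected.append(best)
        remaining.remove(best)
        current_h -= gains[best]
    return selected                      # valid whenever the caller stops

# Tiny example: feature 0 fully determines the class, feature 1 is noise.
X = [(0, 1), (0, 0), (1, 1), (1, 0)]
y = ['a', 'a', 'b', 'b']
print(anytime_feature_selection(X, y, n_features=2, budget_seconds=1.0))  # -> [0]
```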




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Last, M., Kandel, A., Maimon, O., Eberbach, E. (2001). Anytime Algorithm for Feature Selection. In: Ziarko, W., Yao, Y. (eds) Rough Sets and Current Trends in Computing. RSCTC 2000. Lecture Notes in Computer Science (LNAI), vol. 2005. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45554-X_66

  • DOI: https://doi.org/10.1007/3-540-45554-X_66

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43074-2

  • Online ISBN: 978-3-540-45554-7

  • eBook Packages: Springer Book Archive
