An Eager Regression Method Based on Best Feature Projections

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 2070))

Abstract

This paper describes a machine learning method called Regression by Selecting Best Feature Projections (RSBFP). In the training phase, RSBFP projects the training data onto each feature dimension and estimates the predictive power of each feature by constructing simple linear regression lines: one per continuous feature, and one per category of each categorical feature. This distinction is needed because the predictive power of a continuous feature is constant, whereas that of a categorical feature varies across its distinct values. The simple linear regression lines are then sorted by predictive power. In the querying phase, the best linear regression line, and hence the best feature projection, is selected to make predictions.
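The training and querying phases described above can be sketched for continuous features as follows. This is a minimal illustration, not the authors' implementation: the ranking criterion used here (training R²) and the omission of categorical-feature handling are assumptions for the sake of a compact example.

```python
import numpy as np

def fit_projections(X, y):
    """Fit one simple linear regression per feature projection and rank
    the lines by predictive power (here approximated by training R^2;
    the paper's exact ranking criterion may differ)."""
    lines = []
    for j in range(X.shape[1]):
        x = X[:, j]
        # Least-squares fit of y ~ a*x + b on this feature projection.
        a, b = np.polyfit(x, y, 1)
        pred = a * x + b
        ss_res = np.sum((y - pred) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0
        lines.append((r2, j, a, b))
    # Sort regression lines best-first by predictive power.
    lines.sort(key=lambda t: -t[0])
    return lines

def predict(lines, query):
    """Predict the target using the best-ranked regression line."""
    _, j, a, b = lines[0]
    return a * query[j] + b
```

With a dataset whose second feature determines the target linearly, `fit_projections` ranks that feature first and `predict` evaluates its fitted line at the query point.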




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Aydin, T., Güvenir, H.A. (2001). An Eager Regression Method Based on Best Feature Projections. In: Monostori, L., Váncza, J., Ali, M. (eds) Engineering of Intelligent Systems. IEA/AIE 2001. Lecture Notes in Computer Science(), vol 2070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45517-5_25


  • Print ISBN: 978-3-540-42219-8

  • Online ISBN: 978-3-540-45517-2
