Increasing Classification Robustness with Adaptive Features

  • Conference paper
Computer Vision Systems (ICVS 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5008)

Abstract

In machine vision, features are the basis for almost any kind of high-level post-processing, such as classification. A new method is developed that exploits the inherent flexibility of feature calculation to optimize the features for a given classification task. By tuning the parameters of the feature calculation, the accuracy of a subsequent classification can be significantly improved and the decision boundaries can be simplified. The focus of the method is on surface inspection problems and on the features and classifiers used in these applications.

This work was funded by the EC under grant no. 016429, project DynaVis. It reflects only the authors’ views.
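
The abstract stays at a high level. As a rough, hypothetical illustration of the general idea, and not of the authors' actual algorithm, the sketch below tunes a single feature-calculation parameter by sweeping it and keeping the value that yields the best cross-validated accuracy of a simple classifier. The synthetic data, the compute_features function, the threshold parameter and the decision-tree classifier are all assumptions made for this example.

    # Illustrative sketch only (not taken from the paper): tune one hypothetical
    # feature-calculation parameter by sweeping it and keeping the value that
    # maximises cross-validated accuracy of a simple classifier.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)

    # Synthetic stand-in for surface patches: 200 one-dimensional intensity
    # profiles; "defective" patches (label 1) get a bright streak added.
    raw = rng.normal(size=(200, 64))
    labels = rng.integers(0, 2, size=200)
    raw[labels == 1, :12] += 1.5

    def compute_features(patches, threshold):
        # Hypothetical feature calculation with one tunable parameter:
        # fraction of pixels above `threshold`, plus mean and standard deviation.
        above = (patches > threshold).mean(axis=1, keepdims=True)
        return np.hstack([above,
                          patches.mean(axis=1, keepdims=True),
                          patches.std(axis=1, keepdims=True)])

    best_score, best_t = -np.inf, None
    for t in np.linspace(-0.5, 2.5, 31):   # sweep the feature-calculation parameter
        X = compute_features(raw, t)
        score = cross_val_score(DecisionTreeClassifier(max_depth=3),
                                X, labels, cv=5).mean()
        if score > best_score:
            best_score, best_t = score, t

    print("best threshold %+.2f -> CV accuracy %.3f" % (best_t, best_score))

A plain grid sweep stands in here only to keep the sketch short; it is not a claim about the optimization strategy actually used to tune the feature parameters.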

Author information

Authors: Eitzinger, C., Gmainer, M., Heidl, W., Lughofer, E.

Editor information

Antonios Gasteratos, Markus Vincze, John K. Tsotsos

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Eitzinger, C., Gmainer, M., Heidl, W., Lughofer, E. (2008). Increasing Classification Robustness with Adaptive Features. In: Gasteratos, A., Vincze, M., Tsotsos, J.K. (eds) Computer Vision Systems. ICVS 2008. Lecture Notes in Computer Science, vol 5008. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-79547-6_43

  • DOI: https://doi.org/10.1007/978-3-540-79547-6_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-79546-9

  • Online ISBN: 978-3-540-79547-6

  • eBook Packages: Computer Science, Computer Science (R0)
