Abstract
In machine vision, features form the basis for almost any kind of high-level post-processing, such as classification. A new method is developed that exploits the inherent flexibility of feature calculation to optimize the features for a given classification task. By tuning the parameters of the feature calculation, the accuracy of a subsequent classifier can be significantly improved and its decision boundaries simplified. The method focuses on surface-inspection problems and on the features and classifiers typically used in these applications.
This work was funded by the EC under grant no. 016429, project DynaVis. It reflects only the authors’ views.
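The core idea of the abstract — treating feature-calculation parameters as tunable quantities and selecting the values that maximize downstream classification accuracy — can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the synthetic "surface profiles", the moving-average feature, its `width` parameter, and the simple threshold classifier are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surface-inspection data: 1-D profiles, where defective
# samples carry a narrow bump on top of the noise.
def make_profile(defective):
    x = rng.normal(0.0, 1.0, 256)
    if defective:
        t = np.arange(256)
        x += 3.0 * np.exp(-((t - 128) ** 2) / (2 * 4.0 ** 2))  # narrow defect bump
    return x

def feature(profile, width):
    # Tunable feature: peak response of a moving-average filter.
    # `width` is the feature-calculation parameter being adapted.
    kernel = np.ones(width) / width
    return np.convolve(profile, kernel, mode="same").max()

def accuracy(width, profiles, labels):
    # Evaluate a simple threshold classifier on the feature values.
    f = np.array([feature(p, width) for p in profiles])
    thr = 0.5 * (f[labels == 0].mean() + f[labels == 1].mean())
    return np.mean((f > thr) == (labels == 1))

profiles = [make_profile(d) for d in [0] * 50 + [1] * 50]
labels = np.array([0] * 50 + [1] * 50)

# "Adapt" the feature: pick the filter width that maximizes accuracy.
widths = range(1, 40)
best = max(widths, key=lambda w: accuracy(w, profiles, labels))
best_acc = accuracy(best, profiles, labels)
print(best, best_acc)
```

A filter width matched to the defect scale separates the two classes far better than an arbitrary width, which is the effect the abstract describes: the same classifier becomes more accurate, and its decision boundary simpler, once the feature parameters are tuned to the task.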
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Eitzinger, C., Gmainer, M., Heidl, W., Lughofer, E. (2008). Increasing Classification Robustness with Adaptive Features. In: Gasteratos, A., Vincze, M., Tsotsos, J.K. (eds) Computer Vision Systems. ICVS 2008. Lecture Notes in Computer Science, vol 5008. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-79547-6_43
Print ISBN: 978-3-540-79546-9
Online ISBN: 978-3-540-79547-6