A Simple Feature Extraction for High Dimensional Image Representations

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3940)

Abstract

We investigate a method for finding local clusters in low dimensional subspaces of high dimensional data, e.g. in high dimensional image descriptions. Using cluster centers instead of the full set of data speeds up learning algorithms for object recognition, and may also improve accuracy because overfitting is avoided. On the Graz01 database, our method outperforms a current standard method for feature extraction from high dimensional image representations.

This work was presented in a preliminary version at the First Austrian Cognitive Vision Workshop (ACVW 05), Zell an der Pram, January 2005.
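
To make the abstract's idea concrete, here is a minimal sketch (ours, not the authors' algorithm): it compresses a large set of local image descriptors to k cluster centers with plain k-means and then represents each image by a histogram of nearest-center assignments, yielding a fixed-length feature that any standard learner can consume. The paper's key ingredient, finding clusters in low dimensional subspaces of the descriptor space, is deliberately omitted here, and the names kmeans and bag_of_centers are illustrative.

    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Lloyd's k-means on the rows of X; returns a (k, d) array of centers."""
        rng = np.random.default_rng(seed)
        # Initialize centers from k distinct descriptors.
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assign each descriptor to its nearest center (squared Euclidean).
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            labels = d2.argmin(1)
            # Recompute each center; keep the old one if its cluster emptied.
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(0)
        return centers

    def bag_of_centers(descriptors, centers):
        """Normalized histogram of nearest-center assignments for one image."""
        d2 = ((descriptors[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        hist = np.bincount(d2.argmin(1), minlength=len(centers))
        return hist / hist.sum()

    # Toy usage: 1000 random 128-dim descriptors (SIFT-like) compressed to
    # 20 centers; one image's 50 descriptors become a 20-dim feature vector.
    all_desc = np.random.default_rng(1).normal(size=(1000, 128))
    centers = kmeans(all_desc, k=20)
    feature = bag_of_centers(all_desc[:50], centers)
    print(feature.shape)  # (20,)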

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Savu-Krohn, C., Auer, P. (2006). A Simple Feature Extraction for High Dimensional Image Representations. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_11

  • DOI: https://doi.org/10.1007/11752790_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34137-6

  • Online ISBN: 978-3-540-34138-3

  • eBook Packages: Computer Science, Computer Science (R0)
