
Nature Inspiration for Support Vector Machines

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4252)

Abstract

We propose in this paper a new kernel, suited for Support Vector Machine learning, that is inspired by the biological world. The kernel is based on Gabor filters, which are a good model for the response of cells in the primary visual cortex and have been shown to be very effective for processing natural images. Furthermore, we build a link between energy efficiency, a driving force in biological processing systems, and the good generalization ability of learning machines. This connection can serve as a starting point for developing new kernel-based learning algorithms.
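The abstract does not give the kernel's closed form, but a Gabor-like kernel is classically written as a Gaussian envelope modulated by a cosine of the input difference, a stationary and positive semi-definite form that plugs directly into an SVM. The sketch below is illustrative only: the `sigma` and `omega` hyperparameters and the scikit-learn setup are assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn import datasets, svm

def gabor_kernel(X, Y, sigma=1.0, omega=1.0):
    """Gabor-like Gram matrix between row-vector sets X (n,d) and Y (m,d).

    Gaussian envelope exp(-||x-y||^2 / (2*sigma^2)) modulated by
    cos(omega * 1^T (x-y)). Both factors are PSD stationary kernels,
    so their product is a valid SVM kernel. Hyperparameters are
    hypothetical, not values from the paper.
    """
    diff = X[:, None, :] - Y[None, :, :]          # (n, m, d) pairwise differences
    sq_dist = np.sum(diff ** 2, axis=2)           # squared Euclidean distances
    phase = omega * np.sum(diff, axis=2)          # projection onto the all-ones direction
    return np.exp(-sq_dist / (2.0 * sigma ** 2)) * np.cos(phase)

# Toy demonstration on synthetic data with a custom (callable) kernel.
X, y = datasets.make_classification(n_samples=100, n_features=4, random_state=0)
clf = svm.SVC(kernel=gabor_kernel)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

For image data, as in the paper's setting, one would instead tune `sigma` and `omega` per orientation and scale, mirroring how Gabor filter banks model simple cells in the visual cortex.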





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Anguita, D., Sterpi, D. (2006). Nature Inspiration for Support Vector Machines. In: Gabrys, B., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2006. Lecture Notes in Computer Science, vol 4252. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893004_57

  • DOI: https://doi.org/10.1007/11893004_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46537-9

  • Online ISBN: 978-3-540-46539-3
