
Model Selection for Support Vector Machines Using Ant Colony Optimization in an Electronic Nose Application

  • Conference paper
Ant Colony Optimization and Swarm Intelligence (ANTS 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4150)


Abstract

Support vector machines, especially when using radial basis kernels, have given good results in the classification of different volatile compounds. A feature extraction method can be obtained by adjusting the parameters of a modified radial basis kernel, giving more weight to those features that are important for classification purposes. However, the function that has to be minimized to find the best scaling factors is not differentiable and has multiple local minima. In this work we propose to adapt the ideas of the ant colony optimization method to find optimal values of the kernel parameters.
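
A minimal illustration of the idea in the abstract follows. The paper does not include code, so this is only a sketch under stated assumptions: per-feature scaling factors of a modified RBF kernel are chosen from a small discrete grid by an ant-colony-style construction process, with the cross-validated accuracy of the SVM acting as the fitness that guides the pheromone update. The synthetic dataset, the candidate grid, and all ACO parameters (number of ants, iterations, evaporation rate) are illustrative placeholders, not values from the paper.

```python
# Hypothetical sketch: ant-colony-style selection of per-feature RBF scaling
# factors for an SVM. Not the authors' implementation; dataset and parameters
# are illustrative placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Stand-in for electronic-nose sensor data (the real dataset is not public here).
X, y = make_classification(n_samples=200, n_features=6, n_informative=3,
                           random_state=0)

# Each feature i gets a scaling factor theta_i chosen from a discrete grid.
grid = np.array([0.001, 0.01, 0.1, 1.0, 10.0])
n_feat, n_vals = X.shape[1], len(grid)
pheromone = np.ones((n_feat, n_vals))          # tau_{i,j}, initially uniform

def scaled_rbf(theta):
    """Kernel k(a, b) = exp(-sum_i theta_i * (a_i - b_i)^2)."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2 * theta).sum(axis=-1)
        return np.exp(-d2)
    return kernel

def fitness(theta):
    """Cross-validated accuracy of an SVM using the scaled RBF kernel."""
    clf = SVC(kernel=scaled_rbf(theta), C=10.0)
    return cross_val_score(clf, X, y, cv=3).mean()

n_ants, n_iter, rho = 10, 20, 0.1              # illustrative ACO settings
best_theta, best_fit = None, -np.inf
for _ in range(n_iter):
    solutions = []
    for _ in range(n_ants):
        # Each ant picks one grid value per feature, with probability
        # proportional to the pheromone on that (feature, value) pair.
        idx = [rng.choice(n_vals, p=pheromone[i] / pheromone[i].sum())
               for i in range(n_feat)]
        theta = grid[idx]
        f = fitness(theta)
        solutions.append((idx, f))
        if f > best_fit:
            best_theta, best_fit = theta, f
    pheromone *= (1.0 - rho)                   # evaporation
    for idx, f in solutions:                   # deposit proportional to fitness
        for i, j in enumerate(idx):
            pheromone[i, j] += f

print("best scaling factors:", best_theta, "cv accuracy:", best_fit)
```

In the paper the quantity being minimized is an estimate of the SVM's generalization error, which is non-differentiable and multimodal in the scaling factors; plain cross-validation accuracy is used above only to keep the sketch self-contained.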

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Acevedo, J., Maldonado, S., Lafuente, S., Gomez, H., Gil, P. (2006). Model Selection for Support Vector Machines Using Ant Colony Optimization in an Electronic Nose Application. In: Dorigo, M., Gambardella, L.M., Birattari, M., Martinoli, A., Poli, R., Stützle, T. (eds) Ant Colony Optimization and Swarm Intelligence. ANTS 2006. Lecture Notes in Computer Science, vol 4150. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11839088_47

  • DOI: https://doi.org/10.1007/11839088_47

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38482-3

  • Online ISBN: 978-3-540-38483-0

  • eBook Packages: Computer Science, Computer Science (R0)
