
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4065)

Included in the following conference series: Industrial Conference on Data Mining (ICDM)


Abstract

In data mining we encounter many problems, such as function optimization or parameter estimation for classifiers, for which a good search algorithm is essential. In this paper we propose a stochastic, derivative-free algorithm for the unconstrained optimization problem. Many derivative-based local search methods exist, but they usually become trapped in local solutions on non-convex optimization problems. Global search methods, on the other hand, are very time consuming and work only for a limited number of variables. In this paper we investigate a derivative-free, multi-search gradient-based method that overcomes the problem of local minima and produces a global solution in less time. We have tested the proposed method on many benchmark datasets from the literature and compared the results with those of other existing algorithms. The results are very promising.
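Since the full text is not openly accessible here, the following Python sketch only illustrates the general idea the abstract describes; it is not the authors' algorithm. It approximates the gradient by finite differences (a crude stand-in for the discrete gradient) and escapes local minima by restarting the descent from randomly perturbed points, a simple stand-in for stochastic adaptive mutation. All names and parameters below are illustrative assumptions.

```python
import math
import random


def discrete_gradient(f, x, h=1e-5):
    """Forward-difference estimate of the gradient of f at point x (a list).
    Illustrative stand-in for the discrete gradient; not the paper's construction."""
    fx = f(x)
    grad = []
    for i in range(len(x)):
        x_step = list(x)
        x_step[i] += h
        grad.append((f(x_step) - fx) / h)
    return grad


def local_descent(f, x, step=0.05, iters=300):
    """Derivative-free steepest descent with simple step halving."""
    fx = f(x)
    for _ in range(iters):
        g = discrete_gradient(f, x)
        s = step
        while s > 1e-8:
            x_new = [xi - s * gi for xi, gi in zip(x, g)]
            fx_new = f(x_new)
            if fx_new < fx:           # accept the move only if f decreases
                x, fx = x_new, fx_new
                break
            s /= 2                    # otherwise halve the step and retry
        else:
            return x, fx              # no descent possible: (local) minimum
    return x, fx


def multi_start_search(f, dim, starts=20, spread=2.0, seed=0):
    """Restart the local search from randomly perturbed ("mutated") points
    and keep the best result, to escape local minima of a non-convex f."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(starts):
        x0 = [rng.uniform(-spread, spread) for _ in range(dim)]
        x, fx = local_descent(f, x0)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f


if __name__ == "__main__":
    # Rastrigin function: a standard non-convex benchmark with many local minima.
    def rastrigin(x):
        return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

    x, fx = multi_start_search(rastrigin, dim=2)
    print(f"best point {x}, value {fx:.4f}")
```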




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ghosh, R., Ghosh, M., Bagirov, A. (2006). Derivative Free Stochastic Discrete Gradient Method with Adaptive Mutation. In: Perner, P. (eds) Advances in Data Mining. Applications in Medicine, Web Mining, Marketing, Image and Signal Mining. ICDM 2006. Lecture Notes in Computer Science (LNAI), vol 4065. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11790853_21


  • DOI: https://doi.org/10.1007/11790853_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36036-0

  • Online ISBN: 978-3-540-36037-7

  • eBook Packages: Computer Science, Computer Science (R0)
