Convex Kernel Underestimation of Functions with Multiple Local Minima

Published in: Computational Optimization and Applications

Abstract

A function on R^n with multiple local minima is approximated from below, via linear programming, by a linear combination of convex kernel functions, using sample points from the given function. The resulting convex kernel underestimator is then minimized, using a linear equation solver for a linear-quadratic kernel or a Newton method for a Gaussian kernel, to obtain an approximation to a global minimum of the original function. Successively shrinking the search region to which this procedure is applied yields estimates accurate to within 0.0001%, for a Gaussian kernel, of the known global minima of synthetic nonconvex piecewise-quadratic test functions. Gaussian kernel underestimation improves by a factor of ten the relative error obtained by a piecewise-linear underestimator (O.L. Mangasarian, J.B. Rosen, and M.E. Thompson, Journal of Global Optimization, vol. 32, no. 1, pp. 1–9, 2005), while cutting computational time by an average factor of more than 28.
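The two-phase procedure the abstract describes, fitting a kernel underestimator from below by linear programming and then minimizing the fitted surrogate, can be sketched as follows. This is an illustrative one-dimensional toy, not the authors' exact formulation: the test function `f`, the kernel width `sigma`, and the centre grid are invented for the example, and the convexity restriction the paper places on the underestimator (as well as the successive region-shrinking) is omitted.

```python
# Sketch of kernel underestimation via LP (illustrative only).
import numpy as np
from scipy.optimize import linprog, minimize

def f(x):
    # a simple nonconvex function with several local minima
    return np.sin(3.0 * x) + 0.1 * x**2

X = np.linspace(-3.0, 3.0, 30)   # sample points in the search region
C = np.linspace(-3.0, 3.0, 10)   # kernel centres (fewer than samples)
F = f(X)
sigma = 0.9                      # kernel width (chosen ad hoc)

def gram(x, c):
    # Gaussian kernel matrix k(x_i, c_j)
    return np.exp(-((x[:, None] - c[None, :]) ** 2) / (2.0 * sigma**2))

K = gram(X, C)                   # 30 x 10
m, n = K.shape

# LP: maximise sum_i u(x_i) subject to u(x_i) <= f(x_i), where
# u(x) = sum_j a_j k(x, c_j) + b.  Variables: (a_1..a_n, b).
c_obj = -np.concatenate([K.sum(axis=0), [m]])   # minimise -sum u(x_i)
A_ub = np.hstack([K, np.ones((m, 1))])          # u(x_i) <= f(x_i)
res = linprog(c_obj, A_ub=A_ub, b_ub=F, bounds=[(None, None)] * (n + 1))
a, b = res.x[:-1], res.x[-1]

def u(x):
    # the fitted underestimator
    return gram(np.atleast_1d(x), C) @ a + b

# minimise the underestimator, starting from the best sample point
x0 = X[np.argmin(u(X))]
opt = minimize(lambda x: u(x)[0], np.array([x0]), method="BFGS")
print("estimated minimiser:", opt.x[0])
```

The LP is bounded because the objective row is the sum of the constraint rows, so any direction satisfying all constraints cannot increase the objective; the paper's own formulation additionally constrains the coefficients so that the underestimator is convex, which this sketch does not.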


References

  1. V. Cherkassky and F. Mulier, Learning from Data: Concepts, Theory and Methods. John Wiley & Sons: New York, 1998.

  2. S. Cost and S. Salzberg, “A weighted nearest neighbor algorithm for learning with symbolic features,” Machine Learning, vol. 10, pp. 57–78, 1993.

  3. N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines. Cambridge University Press: Cambridge, 2000.

  4. L. Devroye and T.J. Wagner, “Nearest neighbor methods in discrimination,” in P.R. Krishnaiah and L.N. Kanal (Eds.), Handbook of Statistics, Volume 2: Classification, Pattern Recognition and Reduction of Dimensionality, North-Holland: Amsterdam, 1982.

  5. K.A. Dill, A.T. Phillips, and J.B. Rosen, “CGU: An algorithm for molecular structure prediction,” in L.T. Biegler et al. (Eds.), IMA Volumes in Mathematics and its Applications: Large Scale Optimization with Applications III: Molecular Structure and Optimization, 1997, pp. 1–22.

  6. T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning. Springer-Verlag: New York, 2001.

  7. Y.-J. Lee and O.L. Mangasarian, “SSVM: A smooth support vector machine,” Computational Optimization and Applications, vol. 20, pp. 5–22, 2001. Data Mining Institute, University of Wisconsin, Technical Report 99-03. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-03.ps.

  8. O.L. Mangasarian, “Generalized support vector machines,” in A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans (Eds.), Advances in Large Margin Classifiers, MIT Press: Cambridge, MA, 2000, pp. 135–146. ftp://ftp.cs.wisc.edu/math-prog/tech-reports/98-14.ps.

  9. O.L. Mangasarian and D.R. Musicant, “Robust linear and support vector regression,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 9, pp. 950–955, 2000. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-09.ps.

  10. O.L. Mangasarian and D.R. Musicant, “Large scale kernel regression via linear programming,” Machine Learning, vol. 46, pp. 255–269, 2002. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-02.ps.

  11. O.L. Mangasarian, J.B. Rosen, and M.E. Thompson, “Global minimization via piecewise-linear underestimation,” Journal of Global Optimization, vol. 32, no. 1, pp. 1–9, 2005. Also: Technical Report 03-03, Data Mining Institute, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, June 2003. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/03-03.ps.

  12. J.C. Mitchell, A.T. Phillips, J.B. Rosen, and L.F. Ten Eyck, “Coupled optimization in protein docking,” in Optimization in Computational Chemistry and Molecular Biology, pp. 191–207, Kluwer Academic Publishers: Dordrecht, Netherlands, 2000.

  13. A.T. Phillips, J.B. Rosen, and K.A. Dill, “Convex global underestimation for molecular structure prediction,” in P.M. Pardalos et al. (Eds.), From Local to Global Optimization, pp. 1–18, Kluwer Academic Publishers: Dordrecht, Netherlands, 2001.

  14. J.B. Rosen and R.F. Marcia, “Convex quadratic approximation,” Computational Optimization and Applications, vol. 28, pp. 173–184, 2004.

  15. B. Schölkopf, P. Bartlett, A. Smola, and R. Williamson, “Shrinking the tube: A new support vector regression algorithm,” in M.S. Kearns, S.A. Solla, and D.A. Cohn (Eds.), Advances in Neural Information Processing Systems 11, MIT Press: Cambridge, MA, 1999, pp. 330–336. Available at http://www.kernel-machines.org/publications.html.

  16. B. Schölkopf and A. Smola, Learning with Kernels. MIT Press: Cambridge, MA, 2002.

  17. A.N. Tikhonov and V.Y. Arsenin, Solutions of Ill-Posed Problems. John Wiley & Sons: New York, 1977.

  18. V.N. Vapnik, The Nature of Statistical Learning Theory, 2nd edition. Springer: New York, 2000.


Cite this article

Mangasarian, O.L., Rosen, J.B. & Thompson, M.E. Convex Kernel Underestimation of Functions with Multiple Local Minima. Comput Optim Applic 34, 35–45 (2006). https://doi.org/10.1007/s10589-005-3072-1
