
A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes

Journal of Global Optimization

Abstract

In this paper, we develop a version of the bundle method to solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current iteration point and is used to build a separate approximation for each component in the DC representation. By combining these approximations we obtain a new nonconvex cutting plane model of the original objective function, which explicitly takes into account both the convex and the concave behavior of the objective function. We design the proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an \(\varepsilon \)-critical point. The algorithm is tested on some academic test problems, and the preliminary numerical results show the good performance of the new bundle method. An interesting fact is that the new algorithm nearly always finds the global solution in our test problems.
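To make the idea concrete, the following sketch (our own illustration, not the authors' implementation) shows how a convex cutting plane model can be built for each DC component from bundled points, function values and subgradients, and how the two models are combined into a nonconvex piecewise linear model of \(f = f_1 - f_2\). The proximal term, linearization errors and the direction-finding subproblem of the actual method are omitted; the example function is Problem 4 of the appendix with \(n = 2\).

```python
# Minimal sketch (illustration only): convex cutting plane models of the DC
# components f1 and f2, combined into a nonconvex piecewise linear model of f.
import numpy as np

def cutting_plane_model(x, bundle):
    """Max over linearizations f(y) + g^T (x - y) for bundled triples (y, f(y), g)."""
    return max(fy + g @ (x - y) for y, fy, g in bundle)

def dc_model(x, bundle1, bundle2):
    """Model of f = f1 - f2 obtained by combining the two component models."""
    return cutting_plane_model(x, bundle1) - cutting_plane_model(x, bundle2)

# Example DC components (Problem 4 of the appendix with n = 2):
# f1(x) = 2 * max(|x_1|, |x_2|),  f2(x) = |x_1| + |x_2|.
def f1(y): return 2.0 * np.max(np.abs(y))
def g1(y):                                  # one subgradient of f1 at y
    i = int(np.argmax(np.abs(y)))
    g = np.zeros_like(y)
    g[i] = 2.0 if y[i] >= 0 else -2.0
    return g
def f2(y): return float(np.sum(np.abs(y)))
def g2(y): return np.sign(y)                # one subgradient of f2 at y

points  = [np.array(p) for p in ([1.0, -2.0], [0.5, 0.5], [-1.0, 1.5])]
bundle1 = [(y, f1(y), g1(y)) for y in points]
bundle2 = [(y, f2(y), g2(y)) for y in points]

x = np.array([0.2, -0.3])
print(dc_model(x, bundle1, bundle2))        # piecewise linear DC approximation at x
```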


References

  1. An, L.T.H., Ngai, H.V., Tao, P.D.: Exact penalty and error bounds in DC programming. J. Glob. Optim. 52(3), 509–535 (2012)

  2. An, L.T.H., Tao, P.D.: Solving a class of linearly constrained indefinite quadratic problems by D.C. algorithms. J. Glob. Optim. 11(3), 253–285 (1997)

  3. An, L.T.H., Tao, P.D.: The DC (difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems. Ann. Oper. Res. 133(1), 23–46 (2005)

  4. Astorino, A., Fuduli, A., Gaudioso, M.: DC models for spherical separation. J. Glob. Optim. 48(4), 657–669 (2010)

  5. Astorino, A., Fuduli, A., Gaudioso, M.: Margin maximization in spherical separation. Comput. Optim. Appl. 53(2), 301–322 (2012)

  6. Bagirov, A.M.: A method for minimization of quasidifferentiable functions. Optim. Methods Softw. 17(1), 31–60 (2002)

  7. Bagirov, A.M., Ganjehlou, A.N.: A quasisecant method for minimizing nonsmooth functions. Optim. Methods Softw. 25(1), 3–18 (2010)

  8. Bagirov, A.M., Karmitsa, N., Mäkelä, M.M.: Introduction to Nonsmooth Optimization: Theory, Practice and Software. Springer, Cham (2014)

  9. Bagirov, A.M., Ugon, J.: Codifferential method for minimizing nonsmooth DC functions. J. Glob. Optim. 50(1), 3–22 (2011)

  10. Bagirov, A.M., Yearwood, J.: A new nonsmooth optimisation algorithm for minimum sum-of-squares clustering problems. Eur. J. Oper. Res. 170(2), 578–596 (2006)

  11. Bihain, A.: Optimization of upper semidifferentiable functions. J. Optim. Theory Appl. 44(4), 545–568 (1984)

  12. Clarke, F.H.: Optimization and Nonsmooth Analysis. Wiley, New York (1983)

  13. Demyanov, V.F., Rubinov, A.M.: Constructive Nonsmooth Analysis (Approximation and Optimization), vol. 7. Peter Lang, Frankfurt am Main (1995)

  14. Ferrer, A.: Representation of a polynomial function as a difference of convex polynomials with an application. Lecture Notes in Economics and Mathematical Systems 502, 189–207 (2001)

  15. Ferrer, A., Martínez-Legaz, J.E.: Improving the efficiency of DC global optimization methods by improving the DC representation of the objective function. J. Glob. Optim. 43(4), 513–531 (2009). doi:10.1007/s10898-008-9343-5

  16. Fuduli, A., Gaudioso, M., Giallombardo, G.: A DC piecewise affine model and a bundling technique in nonconvex nonsmooth minimization. Optim. Methods Softw. 19(1), 89–102 (2004)

  17. Fuduli, A., Gaudioso, M., Giallombardo, G.: Minimizing nonconvex nonsmooth functions via cutting planes and proximity control. SIAM J. Optim. 14(3), 743–756 (2004)

  18. Fuduli, A., Gaudioso, M., Nurminski, E.A.: A splitting bundle approach for non-smooth non-convex minimization. Optimization 64(5), 1131–1151 (2015)

  19. Hartman, P.: On functions representable as a difference of convex functions. Pac. J. Math. 9(3), 707–713 (1959)

  20. Hiriart-Urruty, J.B.: Generalized differentiability, duality and optimization for problems dealing with differences of convex functions. Lecture Notes in Economics and Mathematical Systems 256, 37–70 (1985)

  21. Hiriart-Urruty, J.B.: From convex optimization to nonconvex optimization. Part I: Necessary and sufficient conditions for global optimality. In: Nonsmooth Optimization and Related Topics, Ettore Majorana International Sciences Series 43. Plenum Press (1988)

  22. Holmberg, K., Tuy, H.: A production-transportation problem with stochastic demand and concave production costs. Math. Prog. 85(1), 157–179 (1999)

  23. Horst, R., Thoai, N.V.: DC programming: Overview. J. Optim. Theory Appl. 103(1), 1–43 (1999)

  24. Horst, R., Tuy, H.: Global Optimization: Deterministic Approaches, 1st edn. Springer, Heidelberg (1990)

  25. Hou, L., Sun, W.: On the global convergence of a nonmonotone proximal bundle method for convex nonsmooth minimization. Optim. Methods Softw. 23(2), 227–235 (2008). doi:10.1080/10556780701549960

  26. Kiwiel, K.C.: An aggregate subgradient method for nonsmooth convex minimization. Math. Prog. 27(3), 320–341 (1983)

  27. Kiwiel, K.C.: Methods of Descent for Nondifferentiable Optimization. Lecture Notes in Mathematics, vol. 1133. Springer, Berlin (1985)

  28. Kiwiel, K.C.: Proximity control in bundle methods for convex nondifferentiable minimization. Math. Prog. 46(1), 105–122 (1990)

  29. Kiwiel, K.C.: Restricted step and Levenberg-Marquardt techniques in proximal bundle methods for nonconvex nondifferentiable optimization. SIAM J. Optim. 6(1), 227–249 (1996)

  30. Lukšan, L.: Dual method for solving a special problem of quadratic programming as a subproblem at linearly constrained nonlinear minmax approximation. Kybernetika 20(6), 445–457 (1984)

  31. Lukšan, L., Spedicato, E.: Variable metric methods for unconstrained optimization and nonlinear least squares. J. Comput. Appl. Math. 124(1–2), 61–95 (2000)

  32. Mäkelä, M.M.: Survey of bundle methods for nonsmooth optimization. Optim. Methods Softw. 17(1), 1–29 (2002)

  33. Mäkelä, M.M.: Multiobjective proximal bundle method for nonconvex nonsmooth optimization: Fortran subroutine MPBNGC 2.0. Reports of the Department of Mathematical Information Technology, Series B, Scientific Computing B 13/2003, University of Jyväskylä, Jyväskylä (2003)

  34. Mäkelä, M.M., Neittaanmäki, P.: Nonsmooth Optimization: Analysis and Algorithms with Applications to Optimal Control. World Scientific Publishing Co., Singapore (1992)

  35. Pey-Chun, C., Hansen, P., Jaumard, B., Tuy, H.: Solution of the multisource Weber and conditional Weber problems by d.c. programming. Oper. Res. 46(4), 548–562 (1998)

  36. Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1970)

  37. Schramm, H., Zowe, J.: A version of the bundle idea for minimizing a nonsmooth function: Conceptual idea, convergence analysis, numerical results. SIAM J. Optim. 2(1), 121–152 (1992)

  38. Sun, W.Y., Sampaio, R.J.B., Candido, M.A.B.: Proximal point algorithm for minimization of DC functions. J. Comput. Math. 21(4), 451–462 (2003)

  39. Tao, P.D., An, L.T.H.: Convex analysis approach to DC programming: Theory, algorithms and applications. Acta Math. Vietnam. 22(1), 289–355 (1997)

  40. Toland, J.F.: On subdifferential calculus and duality in nonconvex optimization. Bull. Soc. Math. France, Mémoire 60, 173–180 (1979)

  41. Tuy, H.: Convex Analysis and Global Optimization, 1st edn. Kluwer, Dordrecht (1998)


Acknowledgements

We are grateful to the anonymous referees for their valuable comments. We would also like to thank Professors M. Gaudioso and A. Fuduli for kindly providing us with the codes of the NCVX and NCVX-penalty methods. This work has been financially supported by the Jenny and Antti Wihuri Foundation, the Turku University Foundation, the University of Turku, the Academy of Finland (project number: 289500) and the Australian Research Council's Discovery Projects funding scheme (project number: DP140103213).

Author information

Corresponding author

Correspondence to Kaisa Joki.

Appendix: Test problems

All test problems are unconstrained DC optimization problems whose objective functions are represented as DC functions:

$$\begin{aligned} f({\varvec{x}}) = f_1({\varvec{x}}) - f_2({\varvec{x}}). \end{aligned}$$

Therefore, in the description of each test problem we give only the component functions \(f_1\) and \(f_2\). The following notation is used to describe the test problems:

  • \({\varvec{x}}_0 \in \mathbb {R}^n\) – starting point;

  • \({\varvec{x}}^* \in \mathbb {R}^n\) – known best solution;

  • \(f^*\) – known best value of the objective function.

Problem 1

[6]

  • Dimension: \(n = 2\),

  • Component functions: \(f_1({\varvec{x}}) = \max \{f_1^1({\varvec{x}}), f_1^2({\varvec{x}}), f_1^3({\varvec{x}})\}+f_2^1({\varvec{x}})+f_2^2({\varvec{x}})+f_2^3({\varvec{x}}),\)

  • \(f_2({\varvec{x}}) = \max \{f_2^1({\varvec{x}})+f_2^2({\varvec{x}}), f_2^2({\varvec{x}})+f_2^3({\varvec{x}}), f_2^1({\varvec{x}})+ f_2^3({\varvec{x}})\},\)

  • \(f_1^1({\varvec{x}}) = x_1^4 + x_2^2, ~f_1^2({\varvec{x}}) = (2-x_1)^2 + (2-x_2)^2, ~f_1^3({\varvec{x}}) = 2e^{-x_1+x_2},\)

  • \(f_2^1({\varvec{x}}) = x_1^2 - 2x_1+x_2^2 - 4x_2 +4, ~f_2^2({\varvec{x}}) = 2x_1^2 - 5x_1+x_2^2 -2x_2 +4,\)

  • \(f_2^3({\varvec{x}}) = x_1^2 + 2x_2^2-4x_2+1,\)

  • Starting point: \({\varvec{x}}_0=(2,2)^T\),

  • Optimum point: \({\varvec{x}}^*=(1, 1)^T\),

  • Optimum value: \(f^*=2\).
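For reference, Problem 1 can be transcribed directly into code; the short check below (our own transcription of the formulas above, not part of the proposed method) confirms that the objective value at the listed optimum point \((1,1)^T\) is indeed 2.

```python
# Direct transcription of Problem 1 (illustration only), used here to check the
# listed optimum value f(x*) = 2 at x* = (1, 1).
import math

def f21(x): return x[0]**2 - 2*x[0] + x[1]**2 - 4*x[1] + 4
def f22(x): return 2*x[0]**2 - 5*x[0] + x[1]**2 - 2*x[1] + 4
def f23(x): return x[0]**2 + 2*x[1]**2 - 4*x[1] + 1

def f1(x):
    f11 = x[0]**4 + x[1]**2
    f12 = (2 - x[0])**2 + (2 - x[1])**2
    f13 = 2 * math.exp(-x[0] + x[1])
    return max(f11, f12, f13) + f21(x) + f22(x) + f23(x)

def f2(x):
    return max(f21(x) + f22(x), f22(x) + f23(x), f21(x) + f23(x))

def f(x):
    return f1(x) - f2(x)

assert abs(f((1.0, 1.0)) - 2.0) < 1e-12   # the reported optimum value f* = 2
print(f((2.0, 2.0)))                      # objective value at the starting point x0
```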

Problem 2

[6]

  • Dimension: \(n = 2\),

  • Component functions: \(f_1({\varvec{x}}) = | x_1 - 1| + 200 \max \{0, | x_1| - x_2\},\)

  • \(f_2({\varvec{x}}) = 100( | x_1| - x_2),\)

  • Starting point: \({\varvec{x}}_0=(-1.2, 1)^T\),

  • Optimum point: \({\varvec{x}}^*=(1,1)^T\),

  • Optimum value: \(f^*=0\).

Problem 3

[6]

  • Dimension: \(n = 4\),

  • Component functions: \(f_1({\varvec{x}}) = | x_1 - 1| + 200 \max \{0, | x_1| - x_2\} + 180 \max \{0, | x_3| - x_4\} + |x_3-1| + 10.1 (|x_2-1| + |x_4-1|) + 4.95 |x_2+x_4-2|\),

  • \(f_2({\varvec{x}}) = 100( | x_1| - x_2) + 90 ( | x_3| - x_4) + 4.95|x_2-x_4|\),

  • Starting point: \({\varvec{x}}_0=(1,3,3,1)^T\),

  • Optimum point: \({\varvec{x}}^*=(1,1,1,1)^T\),

  • Optimum value: \(f^*=0\).

Problem 4

[6]

  • Dimension: \(n = 2, 5, 10, 50, 100, 150, 200, 250, 350, 500, 750,\)

  • Component functions: \(f_1({\varvec{x}}) = n \max \left\{ |x_i|: ~i = 1,\ldots ,n \right\} ,\quad f_2({\varvec{x}}) = \sum _{i=1}^n |x_i|\),

  • Starting point: \({\varvec{x}}_0 = (i, ~i = 1,\ldots ,\left\lfloor n/2\right\rfloor , ~-i, ~i = \left\lfloor n/2\right\rfloor +1,\ldots ,n)^T\),

  • Optimum point: \({\varvec{x}}^*=(x_1^*, \ldots ,x_n^*)^T,\) \(x_i^*=\alpha \) or \(x_i^*=-\alpha \), \(\alpha \in \mathbb {R}\), \(i=1,\ldots ,n\),

  • Optimum value: \(f^*=0\).
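The stated family of optimal points is easy to verify: any point whose components all share the same absolute value \(\alpha\) gives the objective value 0. A small check of our own (for \(n = 5\) and an arbitrary \(\alpha\)):

```python
# Verification (illustration only) that f(x) = n * max_i |x_i| - sum_i |x_i|
# vanishes at every point whose components share the same absolute value.
import itertools
import numpy as np

def f(x):
    x = np.asarray(x, dtype=float)
    return len(x) * np.max(np.abs(x)) - np.sum(np.abs(x))

n, alpha = 5, 3.7
for signs in itertools.product((-1.0, 1.0), repeat=n):
    assert abs(f(alpha * np.array(signs))) < 1e-12   # f* = 0 at each such point
```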

Problem 5

[6]

  • Dimension: \(n = 2, 5, 10, 50, 100, 150, 200, 250, 300, 350, 400, 500, 1000, 1500, 3000, 10\,000, 15\,000, 20\,000, 50\,000\),

  • Component functions: \(f_1({\varvec{x}}) = 20\, \mathrm{max} \left\{ \left| \sum _{i=1}^n(x_i - x_i^*)t_j^{i-1}\right| : j=1,\ldots ,20 \right\} \),

  • \(f_2({\varvec{x}}) = \sum _{j=1}^{20} \left| \sum _{i=1}^n(x_i - x_i^*)t_j^{i-1}\right| , ~t_j = 0.05j, ~j = 1,\ldots ,20\),

  • Starting point: \({\varvec{x}}_0 = (0,\ldots ,0)^T\),

  • Optimum point: \({\varvec{x}}^*=(1/n,\ldots ,1/n)^T\),

  • Optimum value: \(f^*=0\).
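Problem 5 can likewise be written compactly; the sketch below (our own transcription, with the known solution \(x^*\) appearing in the residuals exactly as in the definition above) checks that \(f({\varvec{x}}^*) = 0\) and evaluates the objective at the starting point for \(n = 10\).

```python
# Transcription of Problem 5 (illustration only) with the known solution
# x* = (1/n, ..., 1/n) appearing in the residuals, as in the definition above.
import numpy as np

def f(x, n):
    x = np.asarray(x, dtype=float)
    x_star = np.full(n, 1.0 / n)
    t = 0.05 * np.arange(1, 21)              # t_j = 0.05 j, j = 1, ..., 20
    powers = t[:, None] ** np.arange(n)      # t_j^(i-1), shape (20, n)
    r = powers @ (x - x_star)                # inner sums, one per j
    return 20.0 * np.max(np.abs(r)) - np.sum(np.abs(r))

n = 10
assert abs(f(np.full(n, 1.0 / n), n)) < 1e-12   # f* = 0 at x*
print(f(np.zeros(n), n))                        # value at the starting point x0 = 0
```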

Problems 6–10

(Problems 6–10 are presented as figures a–e in the original article.)


Cite this article

Joki, K., Bagirov, A.M., Karmitsa, N. et al. A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes. J Glob Optim 68, 501–535 (2017). https://doi.org/10.1007/s10898-016-0488-3
