Abstract
In this paper, we develop a version of the bundle method for unconstrained difference of convex (DC) programming problems, assuming that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components of the DC representation. This subgradient information is gathered from a neighborhood of the current iteration point and used to build a separate approximation of each component of the DC representation. Combining these approximations yields a new nonconvex cutting plane model of the original objective function, which explicitly takes into account both its convex and concave behavior. We design a proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an \(\varepsilon \)-critical point. The algorithm is tested on a set of academic test problems, and the preliminary numerical results demonstrate the good performance of the new bundle method. Interestingly, the new algorithm nearly always finds the global solution in our test problems.
References
An, L.T.H., Ngai, H.V., Tao, P.D.: Exact penalty and error bounds in DC programming. J. Glob. Optim. 52(3), 509–535 (2012)
An, L.T.H., Tao, P.D.: Solving a class of linearly constrained indefinite quadratic problems by D.C. algorithms. J. Glob. Optim. 11(3), 253–285 (1997)
An, L.T.H., Tao, P.D.: The DC (difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems. Ann. Oper. Res. 133(1), 23–46 (2005)
Astorino, A., Fuduli, A., Gaudioso, M.: DC models for spherical separation. J. Glob. Optim. 48(4), 657–669 (2010)
Astorino, A., Fuduli, A., Gaudioso, M.: Margin maximization in spherical separation. Comput. Optim. Appl. 53(2), 301–322 (2012)
Bagirov, A.M.: A method for minimizing of quasidifferentiable functions. Optim. Methods Softw. 17(1), 31–60 (2002)
Bagirov, A.M., Ganjehlou, A.N.: A quasisecant method for minimizing nonsmooth functions. Optim. Methods Softw. 25(1), 3–18 (2010)
Bagirov, A.M., Karmitsa, N., Mäkelä, M.M.: Introduction to Nonsmooth Optimization: Theory, Practice and Software. Springer, Cham (2014)
Bagirov, A.M., Ugon, J.: Codifferential method for minimizing nonsmooth DC functions. J. Glob. Optim. 50(1), 3–22 (2011)
Bagirov, A.M., Yearwood, J.: A new nonsmooth optimisation algorithm for minimum sum-of-squares clustering problems. Eur. J. Oper. Res. 170(2), 578–596 (2006)
Bihain, A.: Optimization of upper semidifferentiable functions. J. Optim. Theory Appl. 44(4), 545–568 (1984)
Clarke, F.H.: Optimization and Nonsmooth Analysis. Wiley, New York (1983)
Demyanov, V.F., Rubinov, A.M.: Constructive Nonsmooth Analysis (Approximation and Optimization), vol. 7. Peter Lang, Frankfurt am Main (1995)
Ferrer, A.: Representation of a polynomial function as a difference of convex polynomials with an application. Lecture Notes in Economics and Mathematical Systems 502, 189–207 (2001)
Ferrer, A., Martínez-Legaz, J.E.: Improving the efficiency of DC global optimization methods by improving the DC representation of the objective function. J. Glob. Optim. 43(4), 513–531 (2009). doi:10.1007/s10898-008-9343-5
Fuduli, A., Gaudioso, M., Giallombardo, G.: A DC piecewise affine model and a bundling technique in nonconvex nonsmooth minimization. Optim. Methods Softw. 19(1), 89–102 (2004)
Fuduli, A., Gaudioso, M., Giallombardo, G.: Minimizing nonconvex nonsmooth functions via cutting planes and proximity control. SIAM J. Optim. 14(3), 743–756 (2004)
Fuduli, A., Gaudioso, M., Nurminski, E.A.: A splitting bundle approach for non-smooth non-convex minimization. Optimization 64(5), 1131–1151 (2015)
Hartman, P.: On functions representable as a difference of convex functions. Pac. J. Math. 9(3), 707–713 (1959)
Hiriart-Urruty, J.B.: Generalized differentiability, duality and optimization for problems dealing with differences of convex functions. Lecture Notes in Economics and Mathematical Systems 256, 37–70 (1985)
Hiriart-Urruty, J.B.: From convex optimization to nonconvex optimization. Part I: Necessary and sufficient conditions for global optimality. Nonsmooth Optimization and Related Topics, Ettore Majorana International Sciences Series 43. Plenum Press (1988)
Holmberg, K., Tuy, H.: A production-transportation problem with stochastic demand and concave production costs. Math. Prog. 85(1), 157–179 (1999)
Horst, R., Thoai, N.V.: DC programming: Overview. J. Optim. Theory Appl. 103(1), 1–43 (1999)
Horst, R., Tuy, H.: Global Optimization: Deterministic Approaches, 1st edn. Springer, Heidelberg (1990)
Hou, L., Sun, W.: On the global convergence of a nonmonotone proximal bundle method for convex nonsmooth minimization. Optim. Methods Softw. 23(2), 227–235 (2008). doi:10.1080/10556780701549960
Kiwiel, K.C.: An aggregate subgradient method for nonsmooth convex minimization. Math. Prog. 27(3), 320–341 (1983)
Kiwiel, K.C.: Methods of Descent for Nondifferentiable Optimization. Lecture Notes in Mathematics, vol. 1133. Springer, Berlin (1985)
Kiwiel, K.C.: Proximity control in bundle methods for convex nondifferentiable minimization. Math. Prog. 46(1), 105–122 (1990)
Kiwiel, K.C.: Restricted step and Levenberg-Marquardt techniques in proximal bundle methods for nonconvex nondifferentiable optimization. SIAM J. Optim. 6(1), 227–249 (1996)
Lukšan, L.: Dual method for solving a special problem of quadratic programming as a subproblem at linearly constrained nonlinear minimax approximation. Kybernetika 20(6), 445–457 (1984)
Lukšan, L., Spedicato, E.: Variable metric methods for unconstrained optimization and nonlinear least squares. J. Comput. Appl. Math. 124(1–2), 61–95 (2000)
Mäkelä, M.M.: Survey of bundle methods for nonsmooth optimization. Optim. Methods Softw. 17(1), 1–29 (2002)
Mäkelä, M.M.: Multiobjective proximal bundle method for nonconvex nonsmooth optimization: Fortran subroutine MPBNGC 2.0. Reports of the Department of Mathematical Information Technology, Series B. Scientific Computing B 13/2003, University of Jyväskylä, Jyväskylä (2003)
Mäkelä, M.M., Neittaanmäki, P.: Nonsmooth Optimization: Analysis and Algorithms with Applications to Optimal Control. World Scientific Publishing Co., Singapore (1992)
Pey-Chun, C., Hansen, P., Jaumard, B., Tuy, H.: Solution of the multisource Weber and conditional Weber problems by d.c. programming. Oper. Res. 46(4), 548–562 (1998)
Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1970)
Schramm, H., Zowe, J.: A version of the bundle idea for minimizing a nonsmooth function: Conceptual idea, convergence analysis, numerical results. SIAM J. Optim. 2(1), 121–152 (1992)
Sun, W.Y., Sampaio, R.J.B., Candido, M.A.B.: Proximal point algorithm for minimization of DC functions. J. Comput. Math. 21(4), 451–462 (2003)
Tao, P.D., An, L.T.H.: Convex analysis approach to DC programming: Theory, algorithms and applications. Acta Math. Vietnam. 22(1), 289–355 (1997)
Toland, J.F.: On subdifferential calculus and duality in nonconvex optimization. Bull. Soc. Math. France, Mémoire 60, 173–180 (1979)
Tuy, H.: Convex Analysis and Global Optimization, 1st edn. Kluwer, Dordrecht (1998)
Acknowledgements
We are thankful to the anonymous referees for their valuable comments. We would also like to thank Professors M. Gaudioso and A. Fuduli for kindly providing us with the codes of the NCVX and NCVX-penalty methods. This work has been financially supported by the Jenny and Antti Wihuri Foundation, the Turku University Foundation, the University of Turku, the Academy of Finland (project number: 289500) and the Australian Research Council’s Discovery Projects funding scheme (project number: DP140103213).
Appendix: Test problems
All test problems are unconstrained DC optimization problems, whose objective functions are presented as DC functions
\(f({\varvec{x}}) = f_1({\varvec{x}}) - f_2({\varvec{x}}),\)
where \(f_1\) and \(f_2\) are convex. Therefore, in the description of each test problem we present only the functions \(f_1\) and \(f_2\). The following notations are used to describe the test problems:
- \({\varvec{x}}_0 \in \mathbb {R}^n\) – starting point;
- \({\varvec{x}}^* \in \mathbb {R}^n\) – known best solution;
- \(f^*\) – known best value of the objective function.
Problem 1 [6]
- Dimension: \(n = 2\),
- Component functions: \(f_1({\varvec{x}}) = \max \{f_1^1({\varvec{x}}), f_1^2({\varvec{x}}), f_1^3({\varvec{x}})\}+f_2^1({\varvec{x}})+f_2^2({\varvec{x}})+f_2^3({\varvec{x}}),\)
  \(f_2({\varvec{x}}) = \max \{f_2^1({\varvec{x}})+f_2^2({\varvec{x}}), f_2^2({\varvec{x}})+f_2^3({\varvec{x}}), f_2^1({\varvec{x}})+ f_2^3({\varvec{x}})\},\) where
  \(f_1^1({\varvec{x}}) = x_1^4 + x_2^2, ~f_1^2({\varvec{x}}) = (2-x_1)^2 + (2-x_2)^2, ~f_1^3({\varvec{x}}) = 2e^{-x_1+x_2},\)
  \(f_2^1({\varvec{x}}) = x_1^2 - 2x_1+x_2^2 - 4x_2 +4, ~f_2^2({\varvec{x}}) = 2x_1^2 - 5x_1+x_2^2 -2x_2 +4,\)
  \(f_2^3({\varvec{x}}) = x_1^2 + 2x_2^2-4x_2+1,\)
- Starting point: \({\varvec{x}}_0=(2,2)^T\),
- Optimum point: \({\varvec{x}}^*=(1, 1)^T\),
- Optimum value: \(f^*=2\).
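As a sanity check, Problem 1 can be evaluated directly. The following Python sketch (the names `f1`, `f2`, `f` and the helper `_g` are ours, not part of the original test set) reproduces the stated optimum value \(f({\varvec{x}}^*) = 2\):

```python
import math

def _g(x):
    # the three convex quadratics f_2^1, f_2^2, f_2^3 shared by f1 and f2
    g1 = x[0]**2 - 2*x[0] + x[1]**2 - 4*x[1] + 4
    g2 = 2*x[0]**2 - 5*x[0] + x[1]**2 - 2*x[1] + 4
    g3 = x[0]**2 + 2*x[1]**2 - 4*x[1] + 1
    return g1, g2, g3

def f1(x):
    # convex component: max of three convex pieces plus g1 + g2 + g3
    p1 = x[0]**4 + x[1]**2
    p2 = (2 - x[0])**2 + (2 - x[1])**2
    p3 = 2 * math.exp(-x[0] + x[1])
    return max(p1, p2, p3) + sum(_g(x))

def f2(x):
    # convex component: max of the pairwise sums of g1, g2, g3
    g1, g2, g3 = _g(x)
    return max(g1 + g2, g2 + g3, g1 + g3)

def f(x):
    return f1(x) - f2(x)
```

Note that, since \(f_2\) is the maximum of the pairwise sums of the \(f_2^j\), the difference \(f_1 - f_2\) simplifies to \(\max _i f_1^i({\varvec{x}}) + \min _j f_2^j({\varvec{x}})\).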
Problem 2 [6]
- Dimension: \(n = 2\),
- Component functions: \(f_1({\varvec{x}}) = | x_1 - 1| + 200 \max \{0, | x_1| - x_2\},\)
  \(f_2({\varvec{x}}) = 100( | x_1| - x_2),\)
- Starting point: \({\varvec{x}}_0=(-1.2, 1)^T\),
- Optimum point: \({\varvec{x}}^*=(1,1)^T\),
- Optimum value: \(f^*=0\).
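A minimal Python sketch of Problem 2 (function names are ours) confirms \(f({\varvec{x}}^*) = 0\) at \({\varvec{x}}^* = (1,1)^T\):

```python
def f1(x):
    # convex: |x1 - 1| plus a penalty-type max term
    return abs(x[0] - 1) + 200 * max(0.0, abs(x[0]) - x[1])

def f2(x):
    # convex: piecewise linear in x1, linear in x2
    return 100 * (abs(x[0]) - x[1])

def f(x):
    return f1(x) - f2(x)
```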
Problem 3 [6]
- Dimension: \(n = 4\),
- Component functions: \(f_1({\varvec{x}}) = | x_1 - 1| + 200 \max \{0, | x_1| - x_2\} + 180 \max \{0, | x_3| - x_4\} + |x_3-1| + 10.1 (|x_2-1| + |x_4-1|) + 4.95 |x_2+x_4-2|\),
  \(f_2({\varvec{x}}) = 100( | x_1| - x_2) + 90 ( | x_3| - x_4) + 4.95|x_2-x_4|\),
- Starting point: \({\varvec{x}}_0=(1,3,3,1)^T\),
- Optimum point: \({\varvec{x}}^*=(1,1,1,1)^T\),
- Optimum value: \(f^*=0\).
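Problem 3 extends Problem 2 to four variables; a direct Python sketch (names ours) again reproduces \(f({\varvec{x}}^*) = 0\):

```python
def f1(x):
    # convex component: Problem 2's terms plus their (x3, x4) analogues
    return (abs(x[0] - 1) + 200 * max(0.0, abs(x[0]) - x[1])
            + 180 * max(0.0, abs(x[2]) - x[3]) + abs(x[2] - 1)
            + 10.1 * (abs(x[1] - 1) + abs(x[3] - 1))
            + 4.95 * abs(x[1] + x[3] - 2))

def f2(x):
    # convex component: two linear-minus-abs terms plus a coupling term
    return (100 * (abs(x[0]) - x[1]) + 90 * (abs(x[2]) - x[3])
            + 4.95 * abs(x[1] - x[3]))

def f(x):
    return f1(x) - f2(x)
```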
Problem 4 [6]
- Dimension: \(n = 2, 5, 10, 50, 100, 150, 200, 250, 350, 500, 750\),
- Component functions: \(f_1({\varvec{x}}) = n \max \left\{ |x_i|: ~i = 1,\ldots ,n \right\} ,\quad f_2({\varvec{x}}) = \sum _{i=1}^n |x_i|\),
- Starting point: \({\varvec{x}}_0 = (i, ~i = 1,\ldots ,\left\lfloor n/2\right\rfloor , ~-i, ~i = \left\lfloor n/2\right\rfloor +1,\ldots ,n)^T\),
- Optimum point: \({\varvec{x}}^*=(x_1^*, \ldots ,x_n^*)^T,\) \(x_i^*=\alpha \) or \(x_i^*=-\alpha \), \(\alpha \in \mathbb {R}\), \(i=1,\ldots ,n\),
- Optimum value: \(f^*=0\).
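Problem 4 works for any dimension; a Python sketch (names ours) makes the optimum set transparent:

```python
def f1(x):
    # convex: n times the Chebyshev norm of x
    return len(x) * max(abs(v) for v in x)

def f2(x):
    # convex: the l1 norm of x
    return sum(abs(v) for v in x)

def f(x):
    return f1(x) - f2(x)
```

Here \(f({\varvec{x}}) \ge 0\) for all \({\varvec{x}}\), with equality exactly when all \(|x_i|\) coincide, which matches the stated optimum set \(x_i^* = \pm \alpha \).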
Problem 5 [6]
- Dimension: \(n = 2, 5, 10, 50, 100, 150, 200, 250, 300, 350, 400, 500, 1000, 1500, 3000, 10\,000, 15\,000, 20\,000, 50\,000\),
- Component functions: \(f_1({\varvec{x}}) = 20 \max \left\{ \left| \sum _{i=1}^n(x_i - x_i^*)t_j^{i-1}\right| : j=1,\ldots ,20 \right\} \),
  \(f_2({\varvec{x}}) = \sum _{j=1}^{20} \left| \sum _{i=1}^n(x_i - x_i^*)t_j^{i-1}\right| , ~t_j = 0.05j, ~j = 1,\ldots ,20\),
- Starting point: \({\varvec{x}}_0 = (0,\ldots ,0)^T\),
- Optimum point: \({\varvec{x}}^*=(1/n,\ldots ,1/n)^T\),
- Optimum value: \(f^*=0\).
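Problem 5 is built from 20 polynomial residuals evaluated at the nodes \(t_j = 0.05j\); the following Python sketch (names ours) evaluates it for any \(n\):

```python
def _residuals(x):
    # r_j = sum_i (x_i - x_i^*) t_j^{i-1}, with t_j = 0.05 j and x^* = (1/n, ..., 1/n)
    n = len(x)
    xs = 1.0 / n
    return [sum((x[i] - xs) * (0.05 * j) ** i for i in range(n))
            for j in range(1, 21)]

def f1(x):
    # convex: 20 times the largest absolute residual
    return 20 * max(abs(r) for r in _residuals(x))

def f2(x):
    # convex: the sum of the 20 absolute residuals
    return sum(abs(r) for r in _residuals(x))

def f(x):
    return f1(x) - f2(x)
```

Since the sum of the 20 absolute residuals never exceeds 20 times their maximum, \(f({\varvec{x}}) \ge 0\) everywhere, and \(f\) vanishes at \({\varvec{x}}^*\), where all residuals are zero.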
Problem 6
Problem 7
Problem 8
Problem 9
Problem 10
Joki, K., Bagirov, A.M., Karmitsa, N. et al. A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes. J Glob Optim 68, 501–535 (2017). https://doi.org/10.1007/s10898-016-0488-3