A method combining genetic algorithm with simultaneous perturbation stochastic approximation for linearly constrained stochastic optimization problems

Journal of Combinatorial Optimization

Abstract

This paper considers linearly constrained stochastic optimization problems in which only noisy measurements of the loss function are available. We propose a hybrid method that combines a genetic algorithm (GA) with simultaneous perturbation stochastic approximation (SPSA): the GA searches for the optimum over the whole feasible region, while SPSA refines the search in local regions. During both the GA and SPSA search processes, the hybrid method generates new solutions along the gradient projection direction, which is computed from the active constraints. Because the gradient projection method projects the search direction into the subspace tangent to the active constraints, every new solution strictly satisfies all constraints. We apply the hybrid method to nine typical constrained optimization problems; the results coincide with the ideal solutions cited in the references. The numerical results show that the hybrid method is well suited to multimodal constrained stochastic optimization problems, and that every solution it generates strictly satisfies all linear constraints.
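To make the two ingredients concrete, the following is a minimal, self-contained sketch, not the authors' implementation: the demo problem, gain sequences, and the feasibility-repair step are illustrative assumptions. It shows an SPSA gradient estimate whose search direction is projected onto the subspace tangent to the active linear constraints:

```python
import numpy as np

def spsa_gradient(loss, x, ck, rng):
    """Standard two-measurement SPSA gradient estimate with a
    Bernoulli +/-1 simultaneous perturbation."""
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    return (loss(x + ck * delta) - loss(x - ck * delta)) / (2 * ck) / delta

def project_direction(d, A, b, x, tol=1e-8):
    """Rosen-style projection of direction d onto the null space of the
    constraints of A @ x <= b that are active at x.  (A full implementation
    would also drop active constraints with negative multipliers.)"""
    active = A @ x >= b - tol
    if not active.any():
        return d
    N = A[active]
    return d - N.T @ np.linalg.pinv(N @ N.T) @ (N @ d)

def restore_feasibility(x, A, b):
    """Move x back onto any constraint boundaries it has overshot."""
    viol = A @ x - b
    mask = viol > 0
    if mask.any():
        N = A[mask]
        x = x - N.T @ np.linalg.pinv(N @ N.T) @ viol[mask]
    return x

def constrained_spsa(loss, x0, A, b, iters=3000, a=0.1, c=0.1, seed=0):
    """Minimise a noisy loss subject to A @ x <= b by stepping along the
    projected negative SPSA gradient estimate with decaying gains."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        g = spsa_gradient(loss, x, c / k**0.101, rng)
        d = project_direction(-g, A, b, x)
        x = restore_feasibility(x + (a / k**0.602) * d, A, b)
    return x

# Demo (illustrative): minimise a noisy quadratic with unconstrained optimum
# (2, 2), subject to x1 + x2 <= 2; the constrained minimiser is (1, 1).
noise = np.random.default_rng(1)
loss = lambda x: (x[0] - 2)**2 + (x[1] - 2)**2 + 1e-3 * noise.normal()
A, b = np.array([[1.0, 1.0]]), np.array([2.0])
x = constrained_spsa(loss, [0.0, 0.0], A, b)
# x ends up near (1, 1) and satisfies x1 + x2 <= 2 up to round-off
```

The projection keeps each tentative step tangent to the active constraints, which is what lets the hybrid method keep every iterate feasible.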


References

  • Bazaraa M, Sherali H, Shetty CM (2006) Nonlinear programming: theory and algorithms. Wiley-Interscience, New Jersey

  • Buriol LS, Hirsch MJ, Pardalos PM (2010) A hybrid genetic algorithm for road congestion minimization. Optim Lett 4(4):619–633

  • Conn AR, Gould NIM, Toint PL (1991) A globally convergent augmented Lagrangian algorithm for optimization with general constraints and simple bounds. SIAM J Numer Anal 28(2):545–572. doi:10.1137/0728030

  • Conn AR, Gould NIM, Toint PL (1997) A globally convergent augmented Lagrangian barrier algorithm for optimization with general inequality constraints and simple bounds. Math Comput 66(217):261–288

  • Du D-Z, Zhang X (1986) A convergence theorem of Rosen's gradient projection method. Math Program 36(2):135–144. doi:10.1007/BF02592021

  • Du D-Z, Zhang X (1989) Global convergence of Rosen's gradient projection method. Math Program 44(1–3):357–366. doi:10.1007/BF01587098

  • Du D-Z, Pardalos PM (1999a) Handbook of combinatorial optimization, vol 1–3. Kluwer Academic Publishers, Norwell

  • Du D-Z, Pardalos PM (1999b) Handbook of combinatorial optimization, supplement volume A. Kluwer Academic Publishers, Norwell

  • Du D-Z, Pardalos PM (2005) Handbook of combinatorial optimization, supplement volume B. Kluwer Academic Publishers, Norwell

  • Dupuis P, Kushner H (1987) Asymptotic behavior of constrained stochastic approximations via the theory of large deviations. Probab Theory Relat Fields 75(2):223–244. doi:10.1007/BF00354035

  • Floudas CA, Pardalos PM (2009) Encyclopedia of optimization, 2nd edn. Springer, New York

  • Floudas CA, Pardalos PM (2003) Frontiers in global optimization. Kluwer Academic Publishers, Norwell

  • FengChang M (2009) Optimization method and MATLAB programming. Science Press, Beijing

  • Fu M, Hill D (1997) Optimization of discrete event systems via simultaneous perturbation stochastic approximation. IIE Trans 29(3):233–243. doi:10.1023/A:1018523313043

  • Goldberg DE (1989) Genetic algorithms in search, optimization and machine learning. Addison-Wesley, New York

  • Goldberg DE (2002) The design of innovation: lessons from and for competent genetic algorithms. Kluwer Academic Publishers, Norwell

  • Giannessi F, Pardalos PM (eds) (2001) Optimization theory: recent developments from Matrahaza. Kluwer Academic Publishers, Norwell

  • Gupta R, Agarwal M (2006) Penalty guided genetic search for redundancy optimization in multi-state series-parallel power system. J Comb Optim 12(3):257–277. doi:10.1007/s10878-006-9632-1

  • Hock W, Schittkowski K (1980) Test examples for nonlinear programming codes. J Optim Theory Appl 30(1):127–129. doi:10.1007/BF00934594

  • Hadjisavvas N, Pardalos PM (2001) Advances in convex analysis and global optimization. Kluwer Academic Publishers, Norwell

  • Ji M, Jin Z, Tang H (2006) An improved simulated annealing for solving the linear constrained optimization problems. Appl Math Comput 183(1):251–259

  • Kushner H, Clark D (1978) Stochastic approximation methods for constrained and unconstrained systems. Springer, New York

  • Kushner H, Yin GG (1997) Stochastic approximation algorithms and applications. Springer, New York

  • Kushner HJ, Gavin T (1974) Stochastic approximation type methods for constrained systems: algorithms and numerical results. IEEE Trans Autom Control 19(4):349–357. doi:10.1109/TAC.1974.1100580

  • Kushner HJ, Lakshmivarahan S (1977) Numerical studies of stochastic approximation procedures for constrained problems. IEEE Trans Autom Control 22(3):428–439. doi:10.1109/TAC.1977.1101505

  • Kushner HJ, Sanvicente E (1974) Penalty function methods for constrained stochastic approximation. J Math Anal Appl 46(2):499–512. doi:10.1016/0022-247X(74)90256-X

  • Kushner HJ, Sanvicente E (1975) Stochastic approximation of constrained systems with system and constraint noise. Automatica 11(4):375–380. doi:10.1016/0005-1098(75)90086-2

  • Michalewicz Z (1996) Genetic algorithms + data structures = evolution programs. Springer, Berlin Heidelberg

  • Okamoto T, Hirata H (2009) Constrained optimization using the chaotic Lagrangian method and the simultaneous perturbation gradient approximation. In: Proceedings of ICCAS-SICE 2009, pp 741–746

  • Pardalos PM, Resende MGC (eds) (2002) Handbook of applied optimization. Oxford University Press, New York

  • Pardalos PM, Rajasekaran S (eds) (2001) Handbook of randomization (2 volumes). Kluwer Academic Publishers, Norwell

  • Pardalos PM, Du D-Z (2013) Handbook of combinatorial optimization, 2nd edn. Springer, New York

  • Pardalos PM, Romeijn E (2002) Handbook of global optimization: heuristic approaches, vol 2. Kluwer Academic Publishers, Norwell

  • Rosen JB (1960) The gradient projection method for nonlinear programming. Part I: linear constraints. J Soc Ind Appl Math 8(1):181–217

  • Sadegh P (1997) Constrained optimization via stochastic approximation with a simultaneous perturbation gradient approximation. Automatica 33(5):889–892. doi:10.1016/S0005-1098(96)00230-0

  • Spall JC (1998) Implementation of the simultaneous perturbation algorithm for stochastic optimization. IEEE Trans Aerosp Electron Syst 34(3):817–823. doi:10.1109/7.705889

  • Wang IJ, Spall JC (1998) A constrained simultaneous perturbation stochastic approximation algorithm based on penalty functions. In: Proceedings of the IEEE International Symposium on Intelligent Control (ISIC), held jointly with CIRA and ISAS, pp 452–458. doi:10.1109/ISIC.1998.713704

  • XiaoPing W (2002) Genetic algorithm: theory, application and programming. Xi'an Jiaotong University Press, Xi'an


Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61273174, 61034006, and 60874047).

Author information

Correspondence to Zhang Huajun.

Appendix

  1. Test function 1

    $$\begin{aligned} \begin{array}{l@{\quad }l} \min &{} f = - 3{\left( {1 - {x_1}} \right) ^2}{e^{ - x_1^2 - {{\left( {{x_2} + 1} \right) }^2}}} + 10\left( {0.2{x_1} - x_1^3 - x_2^5} \right) {e^{ - x_1^2 - x_2^2}}\\ &{} + \frac{1}{3}{e^{ - {{\left( {1 + {x_1}} \right) }^2} - x_2^2}}\\ \hbox {Subject to: } &{} {x_1} - {x_2} \ge - 6,\, {x_1} + {x_2} \ge - 6,\, - 4 \le {x_1} \le 4,\, - 4 \le {x_2} \le 4. \end{array} \end{aligned}$$

    The global solution is \({x^ * } = \left( {\begin{array}{cc} {0.000474}&{1.580535} \end{array}} \right) \), and \(f\left( x^* \right) = - 8.105417\).
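As a numerical sanity check (verification code, not from the paper): reading the final exponential as \(e^{-(1+x_1)^2 - x_2^2}\), i.e. the negated MATLAB "peaks" function (the printed exponent appears garbled in extraction), direct evaluation reproduces the stated optimal value:

```python
import math

def f(x1, x2):
    # negated "peaks" objective of test function 1; the last exponent is
    # taken as -(1 + x1)^2 - x2^2, the standard MATLAB peaks form
    return (-3 * (1 - x1)**2 * math.exp(-x1**2 - (x2 + 1)**2)
            + 10 * (0.2*x1 - x1**3 - x2**5) * math.exp(-x1**2 - x2**2)
            + math.exp(-(1 + x1)**2 - x2**2) / 3)

assert abs(f(0.000474, 1.580535) - (-8.105417)) < 1e-3
```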

  2. Test function 2

    $$\begin{aligned} \begin{array}{ll} \min &{} f = \left( {1 + {{\left( {{x_1} + {x_2} + 1} \right) }^2}\left( {19 - 14{x_1} + 3x_1^2 + 6{x_1}{x_2} + 3x_2^2} \right) } \right) \\ &{} \times \left( {30 + {{\left( {2{x_1} - 3{x_2}} \right) }^2}\left( {18 - 32{x_1} + 12x_1^2 + 48{x_2} - 36{x_1}{x_2} + 27x_2^2} \right) } \right) \\ \hbox {Subject to:} &{} - 2 < {x_i} < 2,\, i = 1,2. \end{array} \end{aligned}$$

    The global solution is \({x^ * } = \left( {0, - 1} \right) \), and \(f\left( {{x^ * }} \right) = 3\).
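The stated optimum is easy to confirm numerically (a quick check, not from the paper):

```python
def f(x1, x2):
    # Goldstein-Price function as given in test function 2
    a = 1 + (x1 + x2 + 1)**2 * (19 - 14*x1 + 3*x1**2 + 6*x1*x2 + 3*x2**2)
    b = 30 + (2*x1 - 3*x2)**2 * (18 - 32*x1 + 12*x1**2 + 48*x2 - 36*x1*x2 + 27*x2**2)
    return a * b

assert abs(f(0, -1) - 3) < 1e-9  # global minimum value
```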

  3. Test function 3

    $$\begin{aligned} \begin{array}{ll} \min &{} f = {\left( {{x_2} - \frac{5.1}{4{\pi ^2}}x_1^2 + \frac{5}{\pi }{x_1} - 6} \right) ^2} + 10\left( {1 - \frac{1}{{8\pi }}} \right) \cos \left( {{x_1}} \right) + 10 \\ \hbox {Subject to:} &{} - 5 \le {x_1} \le 10,\, 0 \le {x_2} \le 15 \end{array} \end{aligned}$$

    The problem has three global solutions:

    $$\begin{aligned} x_1^ * = \left( { - \pi ,12.275} \right) ,\, x_2^ * = \left( {\pi ,2.275} \right) ,\, x_3^ * = \left( {3\pi ,2.475} \right) . \end{aligned}$$

    In all cases \(f\left( {x_i^ * } \right) = \frac{5}{{4\pi }},\, i = 1,2,3\).
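All three stated solutions can be checked against the closed-form optimal value \(5/(4\pi)\) (verification code, not from the paper):

```python
import math

def f(x1, x2):
    # Branin function as given in test function 3
    return ((x2 - 5.1/(4*math.pi**2)*x1**2 + 5/math.pi*x1 - 6)**2
            + 10*(1 - 1/(8*math.pi))*math.cos(x1) + 10)

fmin = 5 / (4 * math.pi)  # about 0.39789
for x in [(-math.pi, 12.275), (math.pi, 2.275), (3*math.pi, 2.475)]:
    assert abs(f(*x) - fmin) < 1e-4
```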

  4. Test function 4

    $$\begin{aligned} \begin{array}{ll} \min &{} f = \left\{ {\begin{array}{l} {{x_2} + {{10}^{ - 5}}{{\left( {{x_2} - {x_1}} \right) }^2} - 1,0 \le {x_1} < 2}\\ {\frac{1}{{27\sqrt{3} }}\left( {{{\left( {{x_1} - 3} \right) }^2} - 9} \right) x_2^3,2 \le {x_1} < 4}\\ {\frac{1}{3}{{\left( {{x_1} - 2} \right) }^3} + {x_2} - \frac{11}{3},4 \le {x_1} < 6} \end{array}} \right. \\ \hbox {subject to:} &{} \frac{1}{\sqrt{3} }{x_1} - {x_2} \ge 0,\, - {x_1} - \sqrt{3} {x_2} + 6 \ge 0,\, 0 \le {x_1} \le 6,\, {x_2} \ge 0. \end{array} \end{aligned}$$

    The problem has three global solutions: \(x_1^ * = \left( {0,0} \right) ,\, x_2^ * = \left( {3,\sqrt{3} } \right) ,\, x_3^ * = \left( {4,0} \right) \). In all cases \(f\left( {x_i^ * } \right) = - 1,\, i = 1,2,3\).
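A short check that all three stated solutions attain \(f = -1\) (verification code, not from the paper):

```python
import math

def f(x1, x2):
    # piecewise objective of test function 4
    if 0 <= x1 < 2:
        return x2 + 1e-5 * (x2 - x1)**2 - 1
    if 2 <= x1 < 4:
        return ((x1 - 3)**2 - 9) * x2**3 / (27 * math.sqrt(3))
    if 4 <= x1 <= 6:
        return (x1 - 2)**3 / 3 + x2 - 11/3
    raise ValueError("x1 out of range")

for x in [(0, 0), (3, math.sqrt(3)), (4, 0)]:
    assert abs(f(*x) - (-1)) < 1e-9
```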

  5. Test function 5

    $$\begin{aligned} \begin{array}{ll} \min &{} f = \left( {4 - 2.1x_1^2 + \frac{1}{3}x_1^4} \right) x_1^2 + {x_1}{x_2} + \left( { - 4 + 4x_2^2} \right) x_2^2 \\ \hbox {Subject to:} &{} - 3 \le {x_1} \le 3,\, - 2 \le {x_2} \le 2. \end{array} \end{aligned}$$

    The problem has two global solutions: \(x_1^ * = \left( { - 0.0898,0.7126} \right) ,\, x_2^ * = \left( {0.0898, - 0.7126} \right) \), and \(f\left( {x_1^ * } \right) = f\left( {x_2^ * } \right) = - 1.0316\).
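The two stated minima of this six-hump camel-back function can be confirmed numerically (verification code, not from the paper):

```python
def f(x1, x2):
    # six-hump camel-back function of test function 5
    return (4 - 2.1*x1**2 + x1**4/3)*x1**2 + x1*x2 + (-4 + 4*x2**2)*x2**2

for x in [(-0.0898, 0.7126), (0.0898, -0.7126)]:
    assert abs(f(*x) - (-1.0316)) < 1e-3
```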

  6. Test function 6

    $$\begin{aligned} \begin{array}{ll} \min &{} f = x_1^{0.6} + x_2^{0.6} - 6{x_1} - 4{x_3} + 3{x_4} \\ \hbox {Subject to:} &{} - 3{x_1} + {x_2} - 3{x_3} = 0,\, {x_1} + 2{x_3} \le 4,\, {x_2} + 2{x_4} \le 4,\, {x_1} \le 3,\\ &{} {x_4} \le 1,\, {x_i} \ge 0,\, i = 1,2,3,4. \end{array} \end{aligned}$$

    The best known global solution is \({x^ * } = \left( {\frac{4}{3},4,0,0} \right) \), and \(f\left( {{x^ * }} \right) = - 4.5142\).
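A quick numerical check of the stated best known value (verification code, not from the paper):

```python
def f(x):
    # objective of test function 6
    x1, x2, x3, x4 = x
    return x1**0.6 + x2**0.6 - 6*x1 - 4*x3 + 3*x4

assert abs(f((4/3, 4, 0, 0)) - (-4.5142)) < 1e-3
```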

  7. Test function 7

    $$\begin{aligned} \begin{array}{ll} \min &{} f = 6.5{x_1} - 0.5x_1^2 - {x_2} - 2{x_3} - 3{x_4} - 2{x_5} - {x_6} \\ \hbox {Subject to:} &{} {x_1} + 2{x_2} + 8{x_3} + {x_4} + 3{x_5} + 5{x_6} \le 16,\, - 8{x_1} - 4{x_2}\\ &{}\quad - 2{x_3} + 2{x_4} + 4{x_5} - {x_6} \le - 1\\ &{} 2{x_1} + 0.5{x_2} + 0.2{x_3} - 3{x_4} - {x_5} - 4{x_6} \le 24,\, 0.2{x_1}\\ &{}\quad + 2{x_2} + 0.1{x_3} - 4{x_4} + 2{x_5} + 2{x_6} \le 12\\ &{} - 0.1{x_1} - 0.5{x_2} + 2{x_3} + 5{x_4} - 5{x_5} + 3{x_6} \le 3,\, {x_4} \le 1,\\ &{}\quad {x_5} \le 1,\, {x_6} \le 2,\, {x_i} \ge 0,\, i = 1,2,3,4,5,6. \end{array} \end{aligned}$$

    The global solution is \({x^ * } = \left( {0,6,0,1,1,0} \right) \), and \(f\left( {{x^ * }} \right) = - 11\).
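The stated optimum follows by direct substitution (verification code, not from the paper):

```python
def f(x):
    # objective of test function 7
    x1, x2, x3, x4, x5, x6 = x
    return 6.5*x1 - 0.5*x1**2 - x2 - 2*x3 - 3*x4 - 2*x5 - x6

assert f((0, 6, 0, 1, 1, 0)) == -11
```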

  8. Test function 8

    $$\begin{aligned} \begin{array}{ll} \min &{} f = \sum \nolimits _{j = 1}^{10} {{x_j}\left( {c_j} + \ln \frac{x_j}{\sum \nolimits _{i = 1}^{10} {{x_i}}} \right) } \\ \hbox {subject to:} &{} {x_1} + 2{x_2} + 2{x_3} + {x_6} + {x_{10}} = 2,\, {x_4} + 2{x_5} + {x_6} + {x_7} = 1, \\ &{} {x_3} + {x_7} + {x_8} + 2{x_9} + {x_{10}} = 1,\, {x_i} \ge 0.000001,\, i = 1, \cdots ,10. \end{array} \end{aligned}$$

    Here \(c = ( - 6.089, - 17.164, - 34.054, - 5.914, - 24.721, - 14.986, - 24.1, - 10.708, - 26.663, - 22.179)\). The global solution is

    $$\begin{aligned} {x^ * }&= (0.04034785,0.15386976,0.77497089,0.00167479,0.48468539,\\&\quad 0.00068965,0.02826479,0.01849179,0.03849563,0.10128126),\\&\quad \hbox { and } f\left( {{x^ * }} \right) = - 47.760765. \end{aligned}$$
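The stated optimal value can be reproduced by direct evaluation (verification sketch, not from the paper):

```python
import math

c = [-6.089, -17.164, -34.054, -5.914, -24.721,
     -14.986, -24.1, -10.708, -26.663, -22.179]

def f(x):
    # objective of test function 8
    s = sum(x)
    return sum(xj * (cj + math.log(xj / s)) for xj, cj in zip(x, c))

x_star = [0.04034785, 0.15386976, 0.77497089, 0.00167479, 0.48468539,
          0.00068965, 0.02826479, 0.01849179, 0.03849563, 0.10128126]
assert abs(f(x_star) - (-47.760765)) < 1e-3
```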
  9. Test function 9

    $$\begin{aligned} \begin{array}{ll} \min &{} f\left( x \right) = \sum \nolimits _{k = 0}^4 \left( 2.3{x_{3k + 1}} + 0.0001x_{3k + 1}^2 + 1.7{x_{3k + 2}}\right. \\ &{}\quad \quad \quad \quad \ \left. +\, 0.0001x_{3k + 2}^2 + 2.2{x_{3k + 3}} + 0.0001x_{3k + 3}^2\right) \\ \hbox {subject to:} &{} 0 \le {x_{3k + 1}} - {x_{3k - 2}} + 7 \le 13,\, 0 \le {x_{3k + 2}} - {x_{3k - 1}} + 7 \le 14 \\ &{} 0 \le {x_{3k + 3}} - {x_{3k}} + 7 \le 13,\, {x_1} + {x_2} + {x_3} - 60 \ge 0 \\ &{} {x_4} + {x_5} + {x_6} - 50 \ge 0, {x_7} + {x_8} + {x_9} - 70 \ge 0 \\ &{} {x_{10}} + {x_{11}} + {x_{12}} - 85 \ge 0, {x_{13}} + {x_{14}} + {x_{15}} - 100 \ge 0 \\ &{} 8 \le {x_1} \le 21, 43 \le {x_2} \le 57, 3 \le {x_3} \le 16 \\ &{} 0 \le {x_{3k + 1}} \le 90, 0 \le {x_{3k + 2}} \le 120, 0 \le {x_{3k + 3}} \le 60, k = 1, \cdots ,4 \end{array} \end{aligned}$$

    The global solution is

    $$\begin{aligned} {x^ * } = \left( {8,49,3,1,56,0,1,63,6,3,70,12,5,77,18} \right) , \hbox { and } f\left( {{x^ * }} \right) = 664.82045. \end{aligned}$$


Cite this article

Huajun, Z., Jin, Z. & Hui, L. A method combining genetic algorithm with simultaneous perturbation stochastic approximation for linearly constrained stochastic optimization problems. J Comb Optim 31, 979–995 (2016). https://doi.org/10.1007/s10878-014-9803-4
