Fine-Tuned Constrained Nelder–Mead SOMA

Conference paper in Soft Computing for Problem Solving

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 817)

Abstract

A number of population-based constrained optimization techniques have been proposed in the literature. Typically, a population size of at least 50 is required to solve constrained optimization problems proficiently, which also incurs a large computational cost. In this paper, a hybrid optimization technique, the constrained Nelder–Mead self-organizing migrating algorithm (C-NMSOMA), is proposed that works with a population size of only 20. Its aim is not only to handle constraints but also to reduce the computational cost. To confirm this claim, an experiment has been conducted on a set of ten constrained optimization problems, varying the population size from 20 to 100; C-NMSOMA works best with a population size of 20. To show its effectiveness over other algorithms, a comparison has been made between the best results reported for those algorithms and the results obtained by C-NMSOMA using a population size of 20 only. Experimental results demonstrate that C-NMSOMA is a robust optimization technique that yields feasible solutions in fewer function evaluations using a smaller population. It combines the features of the self-organizing migrating algorithm (SOMA) and Nelder–Mead (NM) simplex search: NM simplex search is used as a crossover operator to produce new individuals in the solution space, and a constraint-handling technique based on preserving the feasibility of solutions, starting from an initialized feasible population, has been adopted.
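The core move described in the abstract, a Nelder–Mead reflection used as a crossover operator together with a feasibility-preserving acceptance rule, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation; the function name, the acceptance rule, and the single-reflection form are all illustrative choices.

```python
import numpy as np

def nm_reflect_crossover(simplex, f, is_feasible, alpha=1.0):
    """One Nelder-Mead reflection used as a crossover step (sketch):
    reflect the worst vertex through the centroid of the others and
    accept the child only if it is feasible and better, so an initially
    feasible population stays feasible."""
    order = np.argsort([f(p) for p in simplex])   # ascending: best first
    pts = simplex[order].astype(float)            # fancy indexing copies
    worst = pts[-1]
    centroid = pts[:-1].mean(axis=0)
    child = centroid + alpha * (centroid - worst)  # reflection point
    if is_feasible(child) and f(child) < f(worst):
        pts[-1] = child                            # replace worst vertex
    return pts

# Example: minimize sum(x^2) subject to |x_i| <= 5.
sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
feasible = lambda x: bool(np.all(np.abs(x) <= 5))
new_simplex = nm_reflect_crossover(
    np.array([[3.0, 0.0], [0.0, 3.0], [4.0, 4.0]]), sphere, feasible)
```

In the example the worst vertex (4, 4) is reflected to (−1, −1), which is feasible and better, so it replaces the worst individual.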


References

  1. Bertsekas, D.P.: Constrained optimization and Lagrange multiplier methods. In: Linear Network Optimization: Algorithms and Codes. M.I.T. Press, London (1991)

  2. Homaifar, A., Qi, C.X., Lai, S.H.: Constrained optimization via genetic algorithms. Simulation 62, 242–253 (1994)

  3. Fletcher, R.: An ideal penalty function for constrained optimization. IMA J. Appl. Math. 15, 319–342 (1975)

  4. Boggs, P.T., Tolle, J.W.: Sequential quadratic programming. Acta Numer. 4 (1995)

  5. Lasdon, L.S., Fox, R.L., Ratner, M.W.: Nonlinear optimization using the generalized reduced gradient method. RAIRO-Oper. Res. 8, 73–103 (1974)

  6. Dou, J., Wang, X.J.: An efficient evolutionary programming. In: International Symposium on Information Science and Engineering (ISISE’08), pp. 401–404 (2008)

  7. Michalewicz, Z.: Genetic algorithms, numerical optimization and constraints. In: Eshelman, L.J. (ed.) Proceedings of the Sixth International Conference on Genetic Algorithms, pp. 151–158 (1995)

  8. Deb, K.: An efficient constraint handling method for genetic algorithms. Comput. Methods Appl. Mech. Eng. 186, 311–338 (2000)

  9. Kirkpatrick, S., Gelatt, C.D., Jr., Vecchi, M.P.: Optimization by simulated annealing. Science 220, 671–680 (1983). https://doi.org/10.1126/science.220.4598.671

  10. Eberhart, R., Kennedy, J.: A new optimizer using particle swarm theory. In: Proceedings of the Sixth International Symposium on Micro Machine and Human Science (1995)

  11. Lampinen, J.: A constraint handling approach for the differential evolution algorithm. In: Proceedings of the 2002 Congress on Evolutionary Computation (CEC’2002), IEEE Service Center, Piscataway, NJ, pp. 1468–1473 (2002)

  12. Yang, X.S.: Harmony search as a metaheuristic algorithm. In: Geem, Z.W. (ed.) Music-Inspired Harmony Search Algorithm. Studies in Computational Intelligence, vol. 191, pp. 1–14. Springer, Berlin, Heidelberg (2009)

  13. Rashedi, E., Nezamabadi-pour, H., Saryazdi, S.: GSA: a gravitational search algorithm. Inf. Sci. 179, 2232–2248 (2009)

  14. Eskandar, H., Sadollah, A., Bahreininejad, A., Hamdi, M.: Water cycle algorithm: a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 110–111, 151–166 (2012)

  15. Eskandar, H., Sadollah, A., Bahreininejad, A., Hamdi, M.: Mine blast algorithm: a new population-based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 13, 2592–2612 (2013)

  16. Michalewicz, Z., Attia, N.F.: Evolutionary optimization of constrained problems. In: Proceedings of the Third Annual Conference on Evolutionary Programming, World Scientific, River Edge, NJ, pp. 998–1008 (1994)

  17. Joines, J.A., Houck, C.R.: On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GAs. In: Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence, Orlando, FL, USA, pp. 587–602 (1994)

  18. Smith, A.E., Coit, D.W.: Constraint handling techniques: penalty functions. In: Handbook of Evolutionary Computation, Institute of Physics Publishing and Oxford University Press, Bristol, U.K., Chapter C5.2 (1997)

  19. Coello, C.A.C.: Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput. Methods Appl. Mech. Eng. 191, 1245–1287 (2002)

  20. Mezura-Montes, E., Coello, C.A.C.: A simple evolution strategy to solve constrained optimization problems. In: Cantú-Paz, E., et al. (eds.) Genetic and Evolutionary Computation—GECCO 2003. Lecture Notes in Computer Science, vol. 2723. Springer, Berlin, Heidelberg (2003)

  21. Mezura-Montes, E., Coello, C.A.C.: Adding a diversity mechanism to a simple evolution strategy to solve constrained optimization problems. In: Proceedings of the IEEE International Congress on Evolutionary Computation (CEC’2003), IEEE Neural Network Society, vol. 1, pp. 6–13 (2003)

  22. Deb, K.: An efficient constraint handling method for genetic algorithms. Comput. Methods Appl. Mech. Eng. 186, 311–338 (2000)

  23. Hu, X., Eberhart, R.: Solving constrained nonlinear optimization problems with particle swarm optimization. In: Proceedings of the Sixth World Multiconference on Systemics, Cybernetics and Informatics (SCI) (2002)

  24. Sun, C., Zeng, J., Pan, J.: A new method for constrained optimization problems to produce initial values. In: Chinese Control and Decision Conference (CCDC’09), pp. 2690–2692 (2009)

  25. Coello, C.A.C., Mezura-Montes, E.: Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv. Eng. Inf. 16, 193–203 (2002)

  26. Deb, K., Agarwal, S.: A niched-penalty approach for constraint handling in genetic algorithms. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna (1999)

  27. Akhtar, S., Tai, K., Ray, T.: A socio-behavioural simulation model for engineering design optimization. Eng. Optim. 34, 341–354 (2002)

  28. Chelouah, R., Siarry, P.: A hybrid method combining continuous tabu search and Nelder-Mead simplex algorithms for the global optimization of multiminima functions. Eur. J. Oper. Res. 161, 636–654 (2005)

  29. Durand, N., Alliot, J.: A combined Nelder-Mead simplex and genetic algorithm. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO’99), pp. 1–7 (1999)

  30. Hedar, A., Fukushima, M.: Derivative-free filter simulated annealing method for constrained continuous global optimization. J. Global Optim. 35, 521–549 (2006)

  31. Long, Q., Wu, C.: A hybrid method combining genetic algorithm and Hooke-Jeeves method for constrained global optimization. J. Ind. Manag. Optim. 10, 1279–1296 (2014)

  32. Deep, K., Dipti: A self-organizing migrating genetic algorithm for constrained optimization. Appl. Math. Comput. 198, 237–250 (2008)

  33. Singh, D., Agrawal, S.: Self organizing migrating algorithm with Nelder-Mead crossover and log-logistic mutation for large scale optimization. In: Acharjya, D., Dehuri, S., Sanyal, S. (eds.) Computational Intelligence for Big Data Analysis. Adaptation, Learning, and Optimization, vol. 19, pp. 143–164. Springer (2015)

  34. Pulido, G.T., Coello, C.A.C.: A constraint-handling mechanism for particle swarm optimization. In: IEEE Congress on Evolutionary Computation (CEC’2004), vol. 2, pp. 1396–1403 (2004)

  35. Datta, R., Costa, M.F.P., Deb, K., Gaspar-Cunha, A.: An evolutionary algorithm based pattern search approach for constrained optimization. In: IEEE Congress on Evolutionary Computation (CEC), pp. 1355–1362 (2013)

  36. Aguirre, A.H., Zavala, A.E.M., Diharce, E.V., Rionda, S.B.: COPSO: constrained optimization via PSO algorithm. Technical Report No. I-07-04, Center for Research in Mathematics (CIMAT) (2007)

  37. Stanarevic, N., Tuba, M., Bacanin, N.: Modified artificial bee colony algorithm for constrained problems optimization. Int. J. Math. Models Methods Appl. Sci. 5, 644–651 (2011)

  38. Wu, J.-Y.: Solving constrained global optimization problems by using hybrid evolutionary computing and artificial life approaches. Math. Probl. Eng. 2012, Article ID 841410 (2012). https://doi.org/10.1155/2012/841410

  39. Zhang, C., Li, X., Gao, L., Wu, Q.: An improved electromagnetism-like mechanism algorithm for constrained optimization. Expert Syst. Appl. 40, 5621–5634 (2013)

  40. Zhao, W., Wang, L., Yin, Y., Wang, B., Wei, Y., Yin, Y.: An improved backtracking search algorithm for constrained optimization problems. In: Buchmann, R., Kifor, C.V., Yu, J. (eds.) Knowledge Science, Engineering and Management (KSEM’2014). Lecture Notes in Computer Science, vol. 8793, pp. 222–233. Springer (2014)

  41. Mezura-Montes, E., Coello, C.A.C., Tun-Morales, E.I.: Simple feasibility rules and differential evolution for constrained optimization. In: Monroy, R., Arroyo-Figueroa, G., Sucar, L.E., Sossa, H. (eds.) MICAI 2004: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 2972. Springer, Berlin, Heidelberg (2004)

  42. Zahara, E., Kao, Y.T.: Hybrid Nelder-Mead simplex search and particle swarm optimization for constrained engineering design problems. Expert Syst. Appl. 36, 3880–3886 (2009)

  43. Runarsson, T.P., Yao, X.: Stochastic ranking for constrained evolutionary optimization. IEEE Trans. Evol. Comput. 4, 284–294 (2000)

  44. Floudas, C.A., Pardalos, P.M., Adjiman, C.S., Esposito, W.R., Gümüs, Z.H., Harding, S.T., Klepeis, J.L., Meyer, C.A., Schweiger, C.A.: Handbook of Test Problems in Local and Global Optimization. Kluwer, Boston, MA, USA (1999)

  45. Mezura-Montes, E., Coello, C.A.C.: Useful infeasible solutions in engineering optimization with evolutionary algorithms. In: Gelbukh, A., de Albornoz, Á., Terashima-Marín, H. (eds.) MICAI 2005: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 3789. Springer, Berlin, Heidelberg (2005)

Author information

Correspondence to Seema Agrawal.

Appendix

1.1 Constrained Problems

The first six functions are well-known benchmark problems taken from Runarsson and Yao [43].

Problem 01

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = 5\sum_{i=1}^{4} x_i - 5\sum_{i=1}^{4} x_i^2 - \sum_{i=5}^{13} x_i \\
\text{subject to:} \quad & g_1(x) = 2x_1 + 2x_2 + x_{10} + x_{11} - 10 \le 0 \\
& g_2(x) = 2x_1 + 2x_3 + x_{10} + x_{12} - 10 \le 0 \\
& g_3(x) = 2x_2 + 2x_3 + x_{11} + x_{12} - 10 \le 0 \\
& g_4(x) = -8x_1 + x_{10} \le 0 \\
& g_5(x) = -8x_2 + x_{11} \le 0 \\
& g_6(x) = -8x_3 + x_{12} \le 0 \\
& g_7(x) = -2x_4 - x_5 + x_{10} \le 0 \\
& g_8(x) = -2x_6 - x_7 + x_{11} \le 0 \\
& g_9(x) = -2x_8 - x_9 + x_{12} \le 0
\end{aligned}
$$

\( 0 \le x_i \le 1 \), i = 1, …, 9; \( 0 \le x_i \le 100 \), i = 10, 11, 12; \( 0 \le x_{13} \le 1 \).

The global minimum is:

$$ \bar{x} = \left( 1,1,1,1,1,1,1,1,1,3,3,3,1 \right), \quad f(\bar{x}) = -15 $$
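As a quick numerical sanity check (not part of the paper), the objective and the nine constraints can be evaluated at the reported optimum:

```python
# Problem 01 at the reported optimum: f should equal -15 and
# every constraint should satisfy g_i <= 0.
def f01(x):
    return 5 * sum(x[:4]) - 5 * sum(v * v for v in x[:4]) - sum(x[4:13])

def g01(x):
    x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, _ = x
    return [2*x1 + 2*x2 + x10 + x11 - 10,
            2*x1 + 2*x3 + x10 + x12 - 10,
            2*x2 + 2*x3 + x11 + x12 - 10,
            -8*x1 + x10, -8*x2 + x11, -8*x3 + x12,
            -2*x4 - x5 + x10, -2*x6 - x7 + x11, -2*x8 - x9 + x12]

xbar = [1]*9 + [3, 3, 3] + [1]
print(f01(xbar), all(g <= 0 for g in g01(xbar)))  # -15 True
```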

Problem 02

$$
\begin{aligned}
\text{Maximize} \quad & f(x) = \left| \frac{\sum_{i=1}^{n} \cos^4(x_i) - 2\prod_{i=1}^{n} \cos^2(x_i)}{\sqrt{\sum_{i=1}^{n} i\,x_i^2}} \right| \\
\text{subject to:} \quad & g_1(x) = 0.75 - \prod_{i=1}^{n} x_i \le 0 \\
& g_2(x) = \sum_{i=1}^{n} x_i - 7.5\,n \le 0
\end{aligned}
$$

where n = 20 and \( 0 \le x_i \le 10 \), i = 1, …, n.

The global maximum is:

$$ f\left( {\bar{x}} \right) = 0.803619. $$

Problem 03

$$
\begin{aligned}
\text{Maximize} \quad & f(x) = \left( \sqrt{n} \right)^{n} \prod_{i=1}^{n} x_i \\
\text{subject to:} \quad & h(x) = \sum_{i=1}^{n} x_i^2 - 1 = 0
\end{aligned}
$$

where n = 10 and \( 0 \le x_i \le 1 \), i = 1, …, n.

The global maximum is:

$$ \bar{x} = \left( {\frac{1}{\sqrt n }, \ldots ,\frac{1}{\sqrt n }} \right),f\left( {\bar{x}} \right) = 1. $$
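A direct check (not from the paper) that \( x_i = 1/\sqrt{n} \) satisfies the equality constraint and attains f = 1 for n = 10:

```python
import math

# Problem 03 at the reported optimum for n = 10.
n = 10
xbar = [1 / math.sqrt(n)] * n
f = math.sqrt(n) ** n * math.prod(xbar)   # (sqrt(n))^n * prod(x_i)
h = sum(v * v for v in xbar) - 1          # equality constraint h(x)
print(abs(f - 1) < 1e-9, abs(h) < 1e-9)   # True True
```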

Problem 04

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = 5.3578547\,x_3^2 + 0.8356891\,x_1 x_5 + 37.293239\,x_1 - 40792.141 \\
\text{subject to:} \quad & g_1(x) = 85.334407 + 0.0056858\,x_2 x_5 + 0.0006262\,x_1 x_4 - 0.0022053\,x_3 x_5 - 92 \le 0 \\
& g_2(x) = -85.334407 - 0.0056858\,x_2 x_5 - 0.0006262\,x_1 x_4 + 0.0022053\,x_3 x_5 \le 0 \\
& g_3(x) = 80.51249 + 0.0071317\,x_2 x_5 + 0.0029955\,x_1 x_2 + 0.0021813\,x_3^2 - 110 \le 0 \\
& g_4(x) = -80.51249 - 0.0071317\,x_2 x_5 - 0.0029955\,x_1 x_2 - 0.0021813\,x_3^2 + 90 \le 0 \\
& g_5(x) = 9.300961 + 0.0047026\,x_3 x_5 + 0.0012547\,x_1 x_3 + 0.0019085\,x_3 x_4 - 25 \le 0 \\
& g_6(x) = -9.300961 - 0.0047026\,x_3 x_5 - 0.0012547\,x_1 x_3 - 0.0019085\,x_3 x_4 + 20 \le 0 \\
& 78 \le x_1 \le 102,\; 33 \le x_2 \le 45,\; 27 \le x_i \le 45,\; i = 3, 4, 5.
\end{aligned}
$$

The global minimum is:

$$ \bar{x} = \left( 78.0, 33.0, 29.995256025682, 45.0, 36.775812905788 \right), \quad f(\bar{x}) = -30665.539. $$
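Evaluating the objective and constraints at the reported optimum (a sanity check, not from the paper; a small slack absorbs the rounding in the published coefficients, since g1 and g6 are active at the optimum):

```python
# Problem 04 at the reported optimum.
x1, x2, x3, x4, x5 = 78.0, 33.0, 29.995256025682, 45.0, 36.775812905788
f = 5.3578547*x3**2 + 0.8356891*x1*x5 + 37.293239*x1 - 40792.141
g = [ 85.334407 + 0.0056858*x2*x5 + 0.0006262*x1*x4 - 0.0022053*x3*x5 - 92,
     -85.334407 - 0.0056858*x2*x5 - 0.0006262*x1*x4 + 0.0022053*x3*x5,
      80.51249 + 0.0071317*x2*x5 + 0.0029955*x1*x2 + 0.0021813*x3**2 - 110,
     -80.51249 - 0.0071317*x2*x5 - 0.0029955*x1*x2 - 0.0021813*x3**2 + 90,
      9.300961 + 0.0047026*x3*x5 + 0.0012547*x1*x3 + 0.0019085*x3*x4 - 25,
     -9.300961 - 0.0047026*x3*x5 - 0.0012547*x1*x3 - 0.0019085*x3*x4 + 20]
print(round(f, 3))                  # close to -30665.539
print(all(gi <= 1e-2 for gi in g))  # True, within rounding slack
```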

Problem 05

$$
\begin{aligned}
\text{Maximize} \quad & f(x) = \frac{\sin^3(2\pi x_1)\,\sin(2\pi x_2)}{x_1^3 \left( x_1 + x_2 \right)} \\
\text{subject to:} \quad & g_1(x) = x_1^2 - x_2 + 1 \le 0 \\
& g_2(x) = 1 - x_1 + \left( x_2 - 4 \right)^2 \le 0 \\
& 0 \le x_i \le 10, \quad i = 1, 2.
\end{aligned}
$$

The global maximum is:

$$ \bar{x} = \left( {1.2279713,4.2453733} \right),f\left( {\bar{x}} \right) = 0.095825. $$
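A quick evaluation at the reported maximizer (not from the paper) confirms both constraints hold and the objective matches to the published precision:

```python
import math

# Problem 05 at the reported maximizer.
x1, x2 = 1.2279713, 4.2453733
f = math.sin(2*math.pi*x1)**3 * math.sin(2*math.pi*x2) / (x1**3 * (x1 + x2))
g1 = x1**2 - x2 + 1          # <= 0 required
g2 = 1 - x1 + (x2 - 4)**2    # <= 0 required
print(round(f, 5))           # close to 0.095825
print(g1 <= 0 and g2 <= 0)   # True
```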

Problem 06

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = x_1^2 + \left( x_2 - 1 \right)^2 \\
\text{subject to:} \quad & h(x) = x_2 - x_1^2 = 0 \\
& -1 \le x_i \le 1, \quad i = 1, 2.
\end{aligned}
$$

The global minimum is:

$$ \bar{x} = \left( { \pm \frac{1}{\sqrt 2 },\frac{1}{2}} \right),f\left( {\bar{x}} \right) = 0.75. $$
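This one can be verified in a couple of lines (a check, not from the paper); either sign of \( x_1 \) gives the same objective value:

```python
import math

# Problem 06 at the reported optimum (+1/sqrt(2), 1/2).
x1, x2 = 1 / math.sqrt(2), 0.5
f = x1**2 + (x2 - 1)**2
h = x2 - x1**2
print(abs(f - 0.75) < 1e-12, abs(h) < 1e-12)  # True True
```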

Problem 07 This problem has been taken from the Handbook of Test Problems in Local and Global Optimization [44].

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = -x_1 - x_5 + 0.4\,x_1^{0.67} x_3^{-0.67} + 0.4\,x_5^{0.67} x_7^{-0.67} \\
\text{subject to:} \quad & g_1(x) = 0.05882\,x_3 x_4 + 0.1\,x_1 \le 1 \\
& g_2(x) = 0.05882\,x_7 x_8 + 0.1\,x_1 + 0.1\,x_5 \le 1 \\
& g_3(x) = 4\,x_2 x_4^{-1} + 2\,x_2^{-0.71} x_4^{-1} + 0.05882\,x_2^{-1.3} x_3 \le 1 \\
& g_4(x) = 4\,x_6 x_8^{-1} + 2\,x_6^{-0.71} x_8^{-1} + 0.05882\,x_6^{-1.3} x_7 \le 1 \\
& 0.01 \le x_i \le 10, \quad i = 1, 2, \ldots, 8.
\end{aligned}
$$

The optimal solution of the problem is:

$$ \begin{aligned} \bar{x} & = \left( {6.4225,\,0.6686,\,1.0239,\,5.9399,\,2.2673,\,0.5960,\,0.4029,5.5288} \right) \\ f\left( {\bar{x}} \right) & = - 6.0482. \\ \end{aligned} $$

Problem 08 This problem was proposed by Mezura-Montes and Coello [45].

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = -x_1 - x_2 \\
\text{subject to:} \quad & g_1(x) = -2x_1^4 + 8x_1^3 - 8x_1^2 + x_2 - 2 \le 0 \\
& g_2(x) = -4x_1^4 + 32x_1^3 - 88x_1^2 + 96x_1 + x_2 - 36 \le 0 \\
& 0 \le x_1 \le 3, \quad 0 \le x_2 \le 4
\end{aligned}
$$

The global minimum is:

$$ \bar{x} = \left( {2.3295,3.17846} \right),f\left( {\bar{x}} \right) = - 5.50796. $$
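Evaluating at the reported minimum (a sanity check, not from the paper; both constraints are active there, so a small slack absorbs the rounding in the published coordinates):

```python
# Problem 08 at the reported optimum.
x1, x2 = 2.3295, 3.17846
f = -x1 - x2
g1 = -2*x1**4 + 8*x1**3 - 8*x1**2 + x2 - 2
g2 = -4*x1**4 + 32*x1**3 - 88*x1**2 + 96*x1 + x2 - 36
print(round(f, 5))                # -5.50796
print(g1 <= 1e-2 and g2 <= 1e-2)  # True, within rounding slack
```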

Problem 09

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = 5.3578\,x_3^2 + 0.8357\,x_1 x_5 + 37.2392\,x_1 \\
\text{subject to:} \quad & g_1(x) = 0.00002584\,x_3 x_5 - 0.0006663\,x_2 x_5 - 0.0000734\,x_1 x_4 - 1 \le 0 \\
& g_2(x) = 0.000853007\,x_2 x_5 + 0.00009395\,x_1 x_4 - 0.00033085\,x_3 - 1 \le 0 \\
& g_3(x) = 0.00024186\,x_2 x_5 + 0.00010159\,x_1 x_2 + 0.00007379\,x_3^2 - 1 \le 0 \\
& g_4(x) = 1330.3294\,x_2^{-1} x_5^{-1} - 0.42\,x_1 x_5^{-1} - 0.30586\,x_2^{-1} x_3^2 x_5^{-1} - 1 \le 0 \\
& g_5(x) = 2275.1327\,x_3^{-1} x_5^{-1} - 0.2668\,x_1 x_5^{-1} - 0.40584\,x_4 x_5^{-1} - 1 \le 0 \\
& g_6(x) = 0.00029955\,x_3 x_5 + 0.00007992\,x_1 x_3 + 0.00012157\,x_3 x_4 - 1 \le 0 \\
& 78 \le x_1 \le 102,\; 33 \le x_2 \le 45,\; 27 \le x_i \le 45,\; i = 3, 4, 5.
\end{aligned}
$$

The global minimum is:

$$ \bar{x} = \left( {78.0,33.0,29.998,45.0,36.7673} \right),f\left( {\bar{x}} \right) = 10122.6964. $$
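The reported objective value can be reproduced directly from the stated coordinates (a sanity check, not from the paper):

```python
# Problem 09 objective at the reported optimum.
x1, x2, x3, x4, x5 = 78.0, 33.0, 29.998, 45.0, 36.7673
f = 5.3578*x3**2 + 0.8357*x1*x5 + 37.2392*x1
print(round(f, 2))  # close to 10122.6964
```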

Problem 10

$$
\begin{aligned}
\text{Minimize} \quad & f(x) = \left( x_1^2 + x_2 - 11 \right)^2 + \left( x_1 + x_2^2 - 7 \right)^2 \\
\text{subject to:} \quad & g_1(x) = 4.84 - \left( x_1 - 0.05 \right)^2 - \left( x_2 - 2.5 \right)^2 \ge 0 \\
& g_2(x) = x_1^2 + \left( x_2 - 2.5 \right)^2 - 4.84 \ge 0 \\
& 0 \le x_i \le 6, \quad i = 1, 2.
\end{aligned}
$$

The global minimum is:

$$ \bar{x} = \left( {2.246826,2.381865} \right),f\left( {\bar{x}} \right) = 13.59085. $$
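A final sanity check (not from the paper); note that \( g_1 \) is active at the optimum, so the rounded coordinates need a tiny feasibility slack:

```python
# Problem 10 at the reported optimum.
x1, x2 = 2.246826, 2.381865
f = (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2
g1 = 4.84 - (x1 - 0.05)**2 - (x2 - 2.5)**2   # >= 0 required (active here)
g2 = x1**2 + (x2 - 2.5)**2 - 4.84            # >= 0 required
print(round(f, 4))               # close to 13.59085
print(g1 >= -1e-3 and g2 >= 0)   # True, allowing rounding slack on g1
```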

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

Cite this paper

Singh, D., Agrawal, S. (2019). Fine-Tuned Constrained Nelder–Mead SOMA. In: Bansal, J., Das, K., Nagar, A., Deep, K., Ojha, A. (eds) Soft Computing for Problem Solving. Advances in Intelligent Systems and Computing, vol 817. Springer, Singapore. https://doi.org/10.1007/978-981-13-1595-4_44
