Artificial immune optimization system solving constrained omni-optimization

  • Research Paper · Published in Evolutionary Intelligence

Abstract

This work investigates an artificial immune optimization system suitable for single- and multi-objective constrained optimization. In this optimizer, an evaluation index, which decides the importance of each individual in the current population, is developed to accelerate population division; a niching-like proliferation scheme is introduced to strengthen the diversity of the population. Thereafter, the diverse antibodies, with the help of immune evolution operations, evolve their structures along different directions. Theoretical results show that such an optimization system is convergent with low computational complexity. Experimentally, the optimizer is examined thoroughly on a suite of single- and multi-objective test problems. Comparative experiments illustrate that the optimizer, with some striking characteristics, is a potential alternative optimization tool for constrained omni-optimization.


References

  1. Zitzler E, Thiele L (1999) Evolutionary algorithms for multiobjective optimization: methods and applications. Doctoral thesis ETH No. 133398. Swiss Federal Institute of Technology, Zurich

  2. Leung YW, Wang YP (2000) Multiobjective programming using uniform design and genetic algorithm. IEEE Trans Syst Man Cybern C Appl Rev 30(3):293–304

  3. Deb K, Agrawal S, Pratap A et al (2002) A fast elitist nondominated sorting genetic algorithm for multi-objective optimization: NSGA-II. IEEE Trans Evol Comput 6:182–197

  4. Aguirre AH, Rionda SB, Coello Coello CA et al (2004) Handling constraints using multiobjective optimization concepts. Int J Numer Methods Eng 59(15):1989–2017

  5. Arumugam MS, Rao MVC, Palaniappan R (2005) New hybrid genetic operators for real coded genetic algorithm to compute optimal control of a class of hybrid systems. Appl Soft Comput 6(1):38–52

  6. Kaya M (2009) MOGAMOD: multi-objective genetic algorithm for motif discovery. Expert Syst Appl 36(2):1039–1047

  7. Deb K, Tiwari S (2005) Omni-optimizer: a procedure for single and multi-objective optimization. In: Coello Coello CA, Aguirre AH, Zitzler E (eds) Proceedings of the 3rd EMO, LNCS, vol 3410, Mexico, pp 47–61

  8. Klanac A, Jelovica J (2007) A concept of omni-optimization for ship structural design. In: Guedes Soares, Das (eds) Advancements in marine structures: proceedings of MARSTRUCT 2007, the 1st international conference on marine structures, Glasgow, pp 473–481

  9. Klanac A, Jelovica J, Niemelänen M et al. Structural omni-optimization of a tanker. COMPIT’2008, Liege

  10. Palonen M, Hasan A, Siren K (2009) A genetic algorithm for optimization of building envelope and HVAC system parameters. Eleventh international IBPSA conference

  11. Timmis J, Knight T, de Castro LN et al (2004) An overview of artificial immune systems. In: Paton R, Bolouri H, Holcombe M (eds) Computation in cells and tissues: perspectives and tools for thought. Natural computation series. Springer, Berlin, pp 51–86

  12. Huang XY, Zhang ZH, He CJ et al (2005) Modern intelligent algorithms: theory and applications. Science Press, Beijing

  13. Campelo F, Guimarães FG, Igarashi H (2007) Overview of artificial immune systems for multi-objective optimization. Proceedings of the 4th international conference on evolutionary multi-criterion optimization, Matsushima, pp 937–951

  14. Hart E, Timmis J (2008) Application areas of AIS: the past, present and the future. Appl Soft Comput 8(1):191–201

  15. Timmis J, Hone A, Stibor T et al (2008) Theoretical advances in artificial immune systems. Theor Comput Sci 403(1):11–32

  16. Jiao LC, Du HF, Gong MG (2006) Immune computing on optimization: learning and recognition. Science Press, Beijing

  17. Su ZG, Wang PH, Yu XJ (2010) Immune genetic algorithm-based adaptive evidential model for estimating unmeasured parameter: estimating levels of coal powder filling in ball mill. Expert Syst Appl 37(7):5246–5258

  18. Aragón VS, Esquivel SC (2008) Optimizing constrained problems through a T-cell artificial immune system. J Comput Sci Technol 8(3):158–165

  19. Cao XB, Qiao H, Xua YW (2007) Negative selection based immune optimization. Adv Eng Softw 38(10):649–656

  20. Woldemariam KM, Yen GG (2010) Vaccine enhanced artificial immune system for multimodal function optimization. IEEE Trans Syst Man Cybern B Cybern 40(1):218–228

  21. de Castro LN, Von Zuben FJ (2000) The clonal selection algorithm with engineering applications. In: workshop proceedings of GECCO, workshop on artificial immune systems and their applications, Las Vegas, pp 36–37

  22. de Castro LN, Timmis J (2002) Artificial immune systems: a new computational intelligence approach. Springer, Berlin

  23. Cutello V, Narzisi G, Nicosia G et al (2006) Real coded clonal selection algorithm for global numerical optimization using a new inversely proportional hypermutation operator. In: The 21st annual ACM symposium on applied computing, SAC 2006. ACM Press 2, Dijon, pp 950–954

  24. Coello Coello CA (2005) Solving multiobjective optimization problems using an artificial immune system. Genetic programming and evolvable machines, pp 163–190

  25. Brownlee J (2006) IIDLE: an immunological inspired distributed learning environment for multiple objective and hybrid optimisation. 2006 IEEE congress on evolutionary computation, sheraton vancouver wall centre hotel, Vancouver

  26. Freschi F, Coello Coello CA, Repetto M Multiobjective optimization and artificial immune system: A review. http://www.igi-global.com/downloads/excerpts/33155.pdf

  27. Campelo F, Guimaraes FG, Igarashi H (2007) Overview of artificial immune systems for multi-objective optimization. In: Obayashi S et al (eds) EMO 2007, LNCS 4403, pp 937–951

  28. Gong MG, Jiao LC, Du HF et al (2008) Multiobjective immune algorithm with nondominated neighbor-based selection. Evol Comput 16(2):225–255

  29. Omkar SN, Khandelwal R, Yathindra S et al (2008) Artificial immune system for multi-objective design optimization of composite structures. Eng Appl Artif Intell 21(8):1416–1429

  30. Tan KC, Goh CK, Mamun AA et al (2008) An evolutionary artificial immune system for multi-objective optimization. Eur J Oper Res 187(2):371–392

  31. Zhang ZH (2007) Immune optimization algorithm for constrained nonlinear multiobjective optimization problems. Appl Soft Comput 7(3):840–857

  32. Xiao HS, Zu JA (2007) A new constrained multiobjective optimization algorithm based on artificial immune systems. 2007 international conference on mechatronics and automation, Harbin, pp 3122–3127

  33. Hong L (2009) An adaptive multi-objective immune optimization algorithm. 2009 IITA international conference on control, automation and systems engineering, pp 140–143

  34. Aydin I, Karakose M, Akin E A multi-objective artificial immune algorithm for parameter optimization in support vector machine. Appl Soft Comput (Available online)

  35. Chen JY, Lin QZ, Ji Z (2010) A hybrid immune multiobjective optimization algorithm. Eur J Oper Res 204(2):294–302

  36. Hu ZH (2010) A multiobjective immune algorithm based on a multiple-affinity model. Eur J Oper Res 202(1):60–72

  37. Gao JQ, Wang J (2010) WBMOAIS: a novel artificial immune system for multiobjective optimization. Comput Oper Res 37(1):50–61

  38. de Castro LN, Von Zuben FJ (2001) aiNet: an artificial immune network for data analysis. In: Abbass HA, Sarker RA, Newton CS (eds) Data mining: a heuristic approach, chapter XII. Idea Group Publishing, USA, pp 231–259

  39. de Castro LN, Timmis J (2002) An artificial immune network for multimodal function optimization. Proceedings of IEEE world congress on evolutionary computation. pp 669–674

  40. Gomes LCT, de Sousa JS, Bezerra GB et al (2003) dopt-aiNet and the gene ordering problem. Information Technology Magazine 3(2):27–33

  41. Coelho GP, Von Zuben FJ (2006) omni-aiNet: an immune-inspired approach for omni-optimization. In: Bersini H, Carneiro J (eds) ICARIS 2006, LNCS 4163, pp 294–308

  42. Oyama A, Shimoyama K, Fujii K et al (2005) New constraint-handling method for multiobjective multiconstraint evolutionary optimization and its application to space plane design. In: Schilling R, Haase W, Periaux J (eds) Evolutionary and deterministic methods for design, optimization and control with applications to industrial and societal problems (Eurogen 2005), Germany, pp 1–13

  43. Van Veldhuizen DA (1999) Multi-objective evolutionary algorithms: classifications, analyses, and new innovations. Doctoral dissertation, Department of Electrical and Computer Engineering, Air Force Institute of Technology, Wright-Patterson AFB, Dayton

  44. Mallipeddi R, Suganthan PN (2006) Problem definitions and evaluation criteria for the CEC 2006 special session on constrained real-parameter optimization. Kanpur genetic algorithms laboratory (KanGAL), Indian Institute of Technology, Kanpur, PIN 208 016, India, Technical report

  45. Deb K, Pratap A, Meyarivan T (2001) Constrained test problems for multi-objective evolutionary optimization. In: Zitzler E, Deb K, Thiele L et al (eds) First international conference on evolutionary multi-criterion optimization. Lecture notes in computer science, vol 1993. Springer, pp 284–298

  46. Kurpati A, Azarm S, Wu J (2002) Constraint handling improvements for multiobjective genetic algorithms. Struct Multidisc Optim 23:204–213

Acknowledgements

This work is supported in part by the National Natural Science Foundation of China (NSFC) under Grants 60565002 and 61065010.

Author information

Corresponding author

Correspondence to Zhuhong Zhang.

Appendices

Appendix 1: Proof of conclusions

Proof of Theorem 2.1

Assume \({\bf y} \prec_{c}{\bf x}\). First, if both x and y are feasible, one can see Level(x) < Level(y), owing to |m(y)| > |m(x)|. Second, if y is feasible but x is not, it follows similarly that Level(x) < Level(y). Further, if y is infeasible, then x is also infeasible because \({\bf y}\prec_{c}{\bf x}\), and accordingly one gets Level(x) < Level(y), again because |m(y)| > |m(x)|. These three cases together establish the conclusion. \(\square\)
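
For readers who want to reproduce the comparison, the sketch below gives a minimal Deb-style constrained-domination test in the spirit of \({\bf y}\prec_{c}{\bf x}\) (feasibility first, then Pareto dominance, then total constraint violation). The function names and the violation measure are illustrative assumptions, not the paper's exact operators.

import numpy as np

def total_violation(g):
    # g holds the inequality constraint values g_j(x); g_j(x) <= 0 means satisfied
    return float(np.sum(np.maximum(np.asarray(g, dtype=float), 0.0)))

def pareto_dominates(fy, fx):
    # True if objective vector fy Pareto-dominates fx (minimization)
    fy, fx = np.asarray(fy, dtype=float), np.asarray(fx, dtype=float)
    return bool(np.all(fy <= fx) and np.any(fy < fx))

def constrained_dominates(fy, gy, fx, gx):
    # y constrained-dominates x: feasibility first, then Pareto dominance,
    # then smaller total constraint violation
    vy, vx = total_violation(gy), total_violation(gx)
    if vy == 0.0 and vx == 0.0:
        return pareto_dominates(fy, fx)   # both feasible
    if vy == 0.0:
        return True                       # only y feasible
    if vx == 0.0:
        return False                      # only x feasible
    return vy < vx                        # both infeasible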

Proof of Theorem 4.1

We only give the proof of computational complexity for the multi-objective optimization case (M > 1). In step 5, each antibody requires \([(N-1)(M+I+J)+1]\) computations and comparisons; step 6 ranks those antibodies with \(N\log_{2}N\) comparisons. These antibodies are then required to enter their corresponding subclasses, which needs (N − 1) comparisons to check which antibodies have the same importance. Therefore, steps 5 and 6 execute \(a_{n}\) computations and comparisons,

$$ \begin{aligned} a_{n}&=N[(M+I+J)(N-1)+1] +N\log_{2}N+N-1\\ &\leq N(N-1)(M+I+J)+N\log_{2}N+2N. \end{aligned} $$
(13)

Hence, the maximal computational complexity of these two steps is \(O((M+I+J)N^{2})\). In step 7, since the memory set only collects feasible antibodies, this step requires \(b_{n}\) comparisons to find at most \(M_{e}\) surviving memory cells,

$$ \begin{aligned} b_{n}&\leq (M_{e}+|F_{1}|)^{2}+(M_{e}+|F_{1}|)\log_{2}(M_{e}+|F_{1}|)\\ &\leq (M_{e}+N)^{2}+(M_{e}+N)\log_{2}(M_{e}+N). \end{aligned} $$
(14)

On the right-hand side of Eq. 14, the first term represents the number of computations needed to delete the identical and dominated members from the union of the memory set and \(F_{1}\); the second term stands for the number of computations used to decide the \(M_{e}\) memory cells by the crowding distance method. Therefore, the maximal computational complexity of step 7 is \(O(N^{2})\), due to \(M_{e}\leq N\).
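
Since step 7 selects the \(M_{e}\) memory cells by the crowding distance measure (as popularized by NSGA-II [3]), which is where the \((M_{e}+|F_{1}|)\log_{2}(M_{e}+|F_{1}|)\) term in Eq. 14 comes from, a minimal sketch of that computation is given below; the array layout and the final selection line are assumptions for illustration only.

import numpy as np

def crowding_distance(F):
    # Crowding distances for an (n, M) array of objective vectors, as in NSGA-II [3]
    F = np.asarray(F, dtype=float)
    n, M = F.shape
    d = np.zeros(n)
    for m in range(M):
        order = np.argsort(F[:, m])
        d[order[0]] = d[order[-1]] = np.inf          # boundary solutions are always kept
        span = F[order[-1], m] - F[order[0], m]
        if span == 0.0 or n < 3:
            continue
        d[order[1:-1]] += (F[order[2:], m] - F[order[:-2], m]) / span
    return d

# Hypothetical usage: keep the M_e most spread-out feasible antibodies
# keep = np.argsort(-crowding_distance(F1_objectives))[:M_e]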

In steps 10 to 12, for the j-th subclass \(F_{j}\), since each antibody is required to find its nearest neighbour, the number of computations and comparisons for step 10 is \((M+1)|F_{j}|(|F_{j}|-1)+|F_{j}|^{2}+|F_{j}|\). Step 11 needs \(p|F_{j}|p_{j}\) executions, and step 12 evaluates the mutated clones with \((M+I+J)|F_{j}|\) computations. Therefore, the total number of computations and comparisons for these steps is given by

$$ \begin{aligned} c_{n}&=\sum_{j=1}^{l}[(M+1)|F_{j}|(|F_{j}|-1)+|F_{j}|^2+|F_{j}|+p|F_{j}|p_{j}+(M+I+J)|F_{j}|]\\ &\leq N[(M+1)(N-1)+N+p+M+I+J+1]. \end{aligned} $$
(15)

Hence, the maximal complexity of such steps is O(N(N(M + I + J) + p)).

In step 15, \(F_{1}\cup C^{\ast}\) is first divided into two subclasses: the feasible solution set \(X_{1}\) and the infeasible solution set \(X_{2}\), which requires \(|F_{1}\cup C^{\ast}|\) operations. Second, if \(|X_{1}| > N\), at most \((M+I+J)|X_{1}|^{2}\) computations are needed to find the nondominated antibodies; otherwise, the crowding distance method is executed on \(X_{2}\), for which the total number of computations and comparisons is \(M|X_{2}|\log_{2}|X_{2}|\). Thus, since \(|X_{1}|\) and \(|X_{2}|\) are not larger than N, this step needs at most \(d_{n}\) computations or comparisons,

$$ \begin{aligned} d_{n}&={\rm max}\{(M+I+J)|X_{1}|^{2},\; M|X_{2}|\log_{2}|X_{2}|\}\\ &\leq (M+I+J)N^{2}. \end{aligned} $$
(16)

In summary, because \(M_{e}\leq N\), the overall worst-case complexity of Omni-AIOS is

$$ \begin{aligned} O(B)&=O((M+I+J)N^{2})+O(MN^{2})+O(N(N(M+I+J)+p))\\ &\quad+O((M+I+J)N^2)\approx O(N(N(M+I+J)+p)). \end{aligned} $$
(17)

\(\square\)
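
As a concrete illustration of Eq. 17 (the figures below are illustrative only and are not taken from the paper's experimental setup), a population of N = 100 antibodies on a problem with M = 2 objectives, I + J = 5 constraints and clone parameter p = 5 gives the per-generation bound

$$ N(N(M+I+J)+p)=100\,(100\times 7+5)=70{,}500, $$

so the cost is dominated by the \(N^{2}(M+I+J)\) dominance-related comparisons rather than by cloning.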

Proof of Theorem 4.2

Given \(X\in S^{\leq N}\), through steps 6 and 10 there exists \((X_{j},X^{\prime}_{j})\in S^{\leq N}\times S^{\leq N}\) satisfying

$$ P\{F_{j}=X_{j}|A_{n}=X\}=1, P\{F_{j}^{\prime}=X_{j}^{\prime}|F_{j}=X_{j}\}=1. $$
(18)

Further, through the mutation operator, for any \({\bf x}\in X\) and \({\bf y}\in S\) there exists some j with \({\bf x}\in F_{j}\). In this case, x can reach y, provided that μ is chosen reasonably as in (8). Thus, P{mutate(x) = y} > 0, where mutate(x) is obtained from x through polynomial mutation. Hence, for \(Z_{j}\in S^{\leq N}\) with \(|Z_{j}|=|X_{j}|\), it follows that \(P\{C^{\ast}_{j}=Z_{j}\,|\,F_{j}=X_{j}\}>0\). Consequently, for all \(X,Z\in S^{\leq N}\),

$$ \begin{aligned} P\{C^{\ast}=Z|A_{n}=X\}&=\prod _{j\geq 1}P\{C^{\ast}_{j}=Z_{j}|F_{j}=X_{j}\}\\ &=\prod _{j\geq 1}P\{C^{\ast}_{j}=Z_{j}|F_{j}^{\prime}=X_{j}^{\prime}\}>0. \end{aligned} $$
(19)

Further, since the sequence \(\{A_{n}\}_{n\geq 1}\) is monotone, the above conclusion holds by Theorem 4 of [43]. \(\square\)
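
The reachability argument above relies on polynomial mutation assigning positive probability to every point of the bounded search space. Below is a minimal sketch of the standard polynomial mutation operator in its simplified form; the distribution index eta_m merely stands in for the parameter fixed in (8), and the exact variant used by Omni-AIOS may differ.

import numpy as np

def polynomial_mutation(x, lower, upper, eta_m=20.0, p_m=None, rng=None):
    # Simplified Deb-style polynomial mutation of a real-coded vector within [lower, upper]
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    y = x.copy()
    p_m = 1.0 / x.size if p_m is None else p_m   # mutate each coordinate with probability p_m
    for i in range(x.size):
        if rng.random() > p_m:
            continue
        u = rng.random()
        if u < 0.5:
            delta = (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0
        else:
            delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0))
        y[i] = np.clip(x[i] + delta * (upper[i] - lower[i]), lower[i], upper[i])
    return y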

Appendix 2: Test problems

This section presents eight popular single-objective, uni-global constrained benchmark problems g01 to g08 [44], one single-objective, multi-global problem G01 [7], eight popular multi-objective, uni-global benchmark problems CPT1 to CPT7 and SPR [45, 46], and one multi-objective, multi-global problem G02 [7]. A short sketch showing how one of these benchmarks can be evaluated follows the problem definitions.

$$ \begin{aligned} &g01.\, {\rm Min}\, f(x) = 5\sum_{i=1}^{4}x_{i}-5\sum_{i=1}^{4} x_{i}^{2} -\sum_{i=5}^{13} x_{i}\\ &\hbox{s.t.,}\\ &g_{1}(x) = 2x_{1} + 2x_{2} + x_{10} + x_{11}-10\leq 0,\\ &g_{2}(x) = 2x_{1} + 2x_{3} + x_{10} + x_{12}-10\leq 0,\\ &g_{3}(x) = 2x_{2} + 2x_{3} + x_{11} + x_{12}-10\leq 0,\\ &g_{4}(x) = -8x_{1} + x_{10}\leq 0,\\ &g_{5}(x) = -8x_{2} + x_{11}\leq 0,\\ &g_{6}(x) = -8x_{3} + x_{12}\leq 0,\\ &g_{7}(x) = -2x_{4} -x_{5} + x_{10}\leq 0,\\ &g_{8}(x) = -2x_{6} -x_{7} + x_{11}\leq 0,\\ &g_{9}(x) = -2x_{8} -x_{9} + x_{12}\leq 0,\\ &0 \leq x_{i}\leq 1,\;1\leq i\leq 9,\; 0\leq x_{i}\leq 100,\; 10\leq i\leq 12,\; 0\leq x_{13}\leq 1. \end{aligned} $$
$$ \begin{aligned} &g02.\, {\rm Min}\, f(x) =-\left|\frac{\sum\nolimits_{i=1}^{n}cos^{4}(x_{i}) -2\prod\nolimits_{i=1}^{n}cos^{2}(x_{i})} {\sqrt{\sum\nolimits_{i=1}^{n}ix_{i}^{2}}}\right|\\ &\hbox{s.t.,}\\ &g_{1}(x) = 0.75-\prod_{i=1}^{n}x_{i}\leq 0, g_{2}(x) = \sum_{i=1}^{n}x_{i}-7.5n\leq 0,\\ &n=20, 0 < x_{i}\leq 10, 1\leq i \leq n. \end{aligned} $$
$$ \begin{aligned} &{g03.\, {\rm Min}\, f(x) =-(\sqrt{n})^{n}\prod_{i=1}^{n}x_{i}}\\ &\hbox{s.t.,}\\ &h_{1}(x) =\sum_{i=1}^{n}x_{i}^{2}-1 = 0,\\ &n=20, \; 0\leq x_{i}\leq 1, \; 1\leq i\leq n. \end{aligned} $$
$$ \begin{aligned} &g04.\, {\rm Min}\, f(x) = 5.3578547x_{3}^{2}+ 0.8356891x_{1}x_{5} + 37.293239x_{1}-40792.141\\ &\hbox{s.t.,}\\ &g_{1}(x) = 85.334407 + 0.0056858x_{2}x_{5} + 0.0006262x_{1}x_{4} -0.0022053x_{3}x_{5}-92\leq 0,\\ &g_{2}(x) =-85.334407-0.0056858x_{2}x_{5}-0.0006262x_{1}x_{4} + 0.0022053x_{3}x_{5}\leq 0,\\ &g_{3}(x) = 80.51249 + 0.0071317x_{2}x_{5} + 0.0029955x_{1}x_{2} + 0.0021813x_{3}^{2}-110\leq 0,\\ &g_{4}(x) = -80.51249-0.0071317x_{2}x_{5}-0.0029955x_{1}x_{2}-0.0021813x_{3}^{2}+90\leq 0,\\ &g_{5}(x) = 9.300961 + 0.0047026x_{3}x_{5} + 0.0012547x_{1}x_{3} + 0.0019085x_{3}x_{4}-25\leq 0,\\ &g_{6}(x) =-9.300961-0.0047026x_{3}x_{5}-0.0012547x_{1}x_{3}-0.0019085x_{3}x_{4} + 20\leq 0,\\ &78\leq x_{1}\leq 102,\; 33\leq x_{2}\leq 45,\; 27\leq x_{i}\leq 45\; (3\leq i \leq 5). \end{aligned} $$
$$ \begin{aligned} &g05.\, {\rm Min}\, f(x) = 3x_{1} + 0.000001x_{1}^{3}+ 2x_{2} + (0.000002/3)x_{2}^{3}\\ &\hbox{s.t.,}\\ &g_{1}(x) =-x_{4} + x_{3}-0.55\leq 0, g_{2}(x) = -x_{3} + x_{4}-0.55\leq 0,\\ &h_{1}(x) = 1000 sin(-x_{3}-0.25) + 1000 sin(-x_{4}-0.25) + 894.8-x_{1} = 0,\\ &h_{2}(x) = 1000 sin(x_{3}-0.25) + 1000 sin(x_{3}-x_{4}-0.25) + 894.8-x_{2} = 0,\\ &h_{3}(x) = 1000 sin(x_{4} -0.25) + 1000 sin(x_{4}-x_{3}-0.25) + 1294.8 = 0,\\ &0\leq x_{1}\leq 1200, 0\leq x_{2}\leq 1200, \; -0.55\leq x_{3}\leq 0.55,\; -0.55\leq x_{4}\leq 0.55. \end{aligned} $$
$$ \begin{aligned} &g06.\, {\rm Min}\, f(x) = (x_{1}-10)^{3}+(x_{2} -20)^{3}\\ &\hbox{s.t.,}\\ &g_{1}(x) = -(x_{1}-5)^{2}-(x_{2}-5)^{2}+ 100\leq 0,\\ &g_{2}(x) = (x_{1}-6)^{2} + (x_{2}-5)^{2}-82.81\leq 0,\\ &13\leq x_{1}\leq 100,\; 0\leq x_{2}\leq 100. \end{aligned} $$
$$ \begin{aligned} &g07.\, {\rm Min}\, f(x) = x_{1}^{2}+ x_{2}^{2} + x_{1}x_{2}-14x_{1}-16x_{2} + (x_{3}-10)^{2} + 4(x_{4}-5)^{2} + (x_{5}-3)^{2}+2(x_{6}-1)^{2} + 5x^{2}_{7} + 7(x_{8}-11)^{2} + 2(x_{9}-10)^{2} + (x_{10}-7)^{2} + 45\\ &\hbox{s.t.,}\\ &g_{1}(x) =-105 + 4x_{1} + 5x_{2}-3x_{7} + 9x_{8}\leq 0,\\ &g_{2}(x) = 10x_{1}-8x_{2}-17x_{7} + 2x_{8}\leq 0,\\ &g_{3}(x) =-8x_{1} + 2x_{2} + 5x_{9}-2x_{10}-12 \leq 0,\\ &g_{4}(x) = 3(x_{1} -2)^{2} + 4(x_{2}-3)^{2} + 2x^{2}_{3}-7x_{4} -120\leq 0,\\ &g_{5}(x) = 5x^{2}_{1} + 8x_{2} + (x_{3}-6)^{2}-2x_{4} -40\leq 0,\\ &g_{6}(x) = x^{2}_{1} + 2(x_{2}-2)^{2}-2x_{1}x_{2} + 14x_{5}-6x_{6}\leq 0,\\ &g_{7}(x) = 0.5(x_{1}-8)^{2} + 2(x_{2}-4)^{2} + 3x^{2}_{5}-x_{6}-30 \leq 0,\\ &g_{8}(x) = -3x_{1} + 6x_{2} + 12(x_{9}-8)^{2}-7x_{10}\leq 0, \\ & -10\leq x_{i}\leq 10,\; 1\leq i\leq 10. \end{aligned} $$
$$ \begin{aligned} &g08.\, {\rm Min}\, f(x) = -\frac{sin^{3}(2\pi x_{1}) sin(2\pi x_{2})}{x^{3}_{1}(x_{1} + x_{2})}\\ &\hbox{s.t.,}\\ &g_{1}(x) = x^{2}_{1}-x_{2} + 1\leq 0,\\ &g_{2}(x) = 1-x_{1} + (x_{2}-4)^{2}\leq 0,\\ &0\leq x_{1},\;x_{2}\leq 10. \end{aligned} $$
$$ \begin{aligned} &G01.\, {\rm Min}\, f(x) =sin(\pi x_{1})cos(\pi x_{2})\\ &0\leq x_{1}, x_{2} \leq 40. \end{aligned} $$
$$ \begin{aligned} &CPT1.\, {\rm Min}\, {\bf f}(x) =(f_{1}(x), f_{2}(x))\\ &\hbox{s.t.,}\\ &f_{1}(x)=x_{1}, f_{2}(x)=g(x)exp(-\frac{f_{1}(x)}{g(x)}),\\ &c_{j}(x)=f_{2}(x)-a_{j}exp(-b_{j}f_{1}(x))\geq 0, 1\leq j\leq J.\\ &g({\bf x})=1+\frac{1}{100}\sum_{i=1}^{100}[x_{i}^{2}-cos(2\pi x_{i})], x_{i}\in [-5.12,5.12],1\leq i\leq 100. \end{aligned} $$
$$ \begin{aligned} &CPT2-CPT7.\, {\rm Min}\, {\bf f}(x) =(f_{1}(x), f_{2}(x))\\ &\hbox{s.t.,}\\ &f_{1}(x)=x_{1}, f_{2}(x)=g(x)(1-\frac{f_{1}(x)}{g(x)}),\\ &c(x)=cos(\theta)(f_{2}(x)-e)-sin(\theta)f_{1}(x) \geq a|sin(b\pi (sin(\theta)(f_{2}(x)-e)+cos(\theta)f_{1}(x))^{c})|^{d},\\ &g(x)=An+\sum_{i=1}^{n}[x_{i}^{2}-A cos(\omega x_{i})], A=10,\omega=2\pi, x_{i}\in [-5.12,5.12],1\leq i\leq n, \end{aligned} $$

where g(x) plays the same role as the corresponding function in CPT1.

$$ \begin{aligned} &Speed\, Reducer (SPR).\, {\rm Min}\, {\bf f}(x) =(f_{1}(x), f_{2}(x))\\ &\hbox{s.t.,}\\ &f_{1}(x)=0.7854x_{1}x_{2}^{2}(\frac{10x_{3}^{2}}{3} +14.933x_{3}-43.0934)-1.508x_{1}(x_{6}^{2}+x_{7}^{2}) +7.477(x_{6}^{3}+x_{7}^{3})+0.7854 (x_{4}x_{6}^{2}+x_{5}x_{7}^{2}),\\ &f_{2}(x)=\frac{\sqrt{(\frac{745x_{4}}{x_{2}x_{3}})^{2}+1.69\times 10^{7}}}{0.1x_{6}^{3}}, g_{1}: \frac{1}{x_{1}x_{2}^{2}x_{3}}-\frac{1}{27}\leq 0,\, g_{2}:\frac{1}{x_{1}x_{2}^{2}x_{3}^{2}}-\frac{1}{397.5}\leq 0,\\ &g_{3}:\frac{x_{4}^{3}}{x_{2}x_{3}x_{6}^{4}}-\frac{1}{1.93}\leq 0,\, g_{4}:\frac{x_{5}^{3}}{x_{2}x_{3}x_{7}^{4}}-\frac{1}{1.93}\leq 0, g_{5}: x_{2}x_{3}-40\leq 0, g_{6}: \frac{x_{1}}{x_{2}}-12\leq 0,\\ &g_{7}:5-\frac{x_{1}}{x_{2}}\leq 0, g_{8}: 1.9-x_{4}+1.5 x_{6}\leq 0, g_{9}:1.9-x_{5}+1.1 x_{7}\leq 0, g_{10}: f_{1}(x)\leq 1300,\\ &g_{11}: \frac{\sqrt{(\frac{745x_{5}}{x_{2}x_{3}})^{2}+1.575\times 10^{8}}}{0.1x_{7}^{3}}\leq 1100, 2.6\leq x_{1}\leq 3.6, 0.7\leq x_{2}\leq 0.8,\\ &17\leq x_{3}\leq 28, 7.3\leq x_{4}\leq 8.3, 7.3\leq x_{5}\leq 8.3, 2.9\leq x_{6}\leq 3.9, 5\leq x_{7}\leq 5.5. \end{aligned} $$
$$ \begin{aligned} &G02.\, {\rm Min}\, {\bf f}({\bf x})=(f_{1}({\bf x}),f_{2}({\bf x})),\\ &\hbox{s.t.,}\\ &f_{1}({\bf x})=\sum_{i=1}^{n}sin(\pi x_{i}), f_{2}({\bf x})=\sum_{i=1}^{n}cos(\pi x_{i}),\\ &n=10, 0\leq x_{i}\leq 6, 1\leq i\leq n. \end{aligned} $$
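
As promised above, the sketch below evaluates one of these benchmarks, g06, returning its objective and constraint values. The return convention and the violation measure are illustrative assumptions, and the quoted optimum is the commonly reported value for this benchmark, not a result of this paper.

import numpy as np

def g06(x):
    # Objective and inequality constraints of test problem g06; g_j(x) <= 0 means satisfied
    x1, x2 = x
    f = (x1 - 10.0) ** 3 + (x2 - 20.0) ** 3
    g = np.array([
        -(x1 - 5.0) ** 2 - (x2 - 5.0) ** 2 + 100.0,
        (x1 - 6.0) ** 2 + (x2 - 5.0) ** 2 - 82.81,
    ])
    return f, g

lower, upper = np.array([13.0, 0.0]), np.array([100.0, 100.0])   # variable bounds of g06
f, g = g06(np.array([14.095, 0.84296]))        # near the commonly reported optimum f* ≈ -6961.81
print(f, np.maximum(g, 0.0).sum())             # objective value and total constraint violation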

About this article

Cite this article

Zhang, Z. Artificial immune optimization system solving constrained omni-optimization. Evol. Intel. 4, 203–218 (2011). https://doi.org/10.1007/s12065-011-0064-1
