
New evolutionary optimization method based on information sets

Published in: Applied Intelligence

Abstract

This paper proposes a new evolutionary learning method, free of algorithm-specific parameters, for solving optimization problems. The method is inspired by the information set concept, which represents the uncertainty in an effort using an entropy function. Termed Human Effort For Achieving Goals (HEFAG), it comprises two phases: an emulation phase and a boosting phase. In the emulation phase, each contender emulates the outcome of the best achiever, and the efforts associated with the average outcome and the best outcome are converted into information values based on the information set. In the boosting phase, the effort of every contender is boosted by adding the differential information values of two randomly chosen contenders. The method is tested on standard benchmark functions and is found to outperform several well-known evolutionary methods, based on statistical analysis of the experimental results using the Kruskal-Wallis and Wilcoxon rank-sum tests.
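The two-phase loop described in the abstract can be sketched in code. This is a speculative reading, not the authors' exact equations: the Gaussian membership used to form the information values, the update coefficients, and the greedy replacement rule are all illustrative assumptions.

```python
import numpy as np

def hefag(objective, dim, pop_size=20, iters=100, bounds=(-5.0, 5.0), seed=0):
    """Sketch of the HEFAG emulation/boosting loop (assumed details)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))          # contenders' efforts
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(iters):
        best = pop[np.argmin(fit)]
        mean = pop.mean(axis=0)

        # Information values: effort weighted by a Gaussian membership
        # centered on the best outcome (an assumed entropy-style weighting).
        sigma = pop.std(axis=0) + 1e-12
        info = pop * np.exp(-((pop - best) ** 2) / (2.0 * sigma ** 2))

        # Emulation phase: each contender moves toward the best achiever,
        # guided by the gap between the best and average outcomes.
        cand = (pop
                + rng.random((pop_size, dim)) * (best - pop)
                + rng.random((pop_size, dim)) * (best - mean))

        # Boosting phase: add the differential information values of two
        # randomly chosen contenders.
        r1, r2 = rng.integers(pop_size, size=(2, pop_size))
        cand = cand + rng.random((pop_size, dim)) * (info[r1] - info[r2])

        cand = np.clip(cand, lo, hi)
        cand_fit = np.apply_along_axis(objective, 1, cand)
        improved = cand_fit < fit                        # greedy selection
        pop[improved], fit[improved] = cand[improved], cand_fit[improved]

    return pop[np.argmin(fit)], float(fit.min())
```

On the sphere function (Appendix A) this sketch converges toward the origin; the greedy selection guarantees the best fitness is non-increasing, which is why no explicit elitism step is needed.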





Author information


Correspondence to Jyotsana Grover.

Appendix A: Some standard functions

The functions tested in the paper are as follows:

  1. Sphere function \(f_{1}(x)={\sum }_{i = 1}^{n}{x_{i}^{2}}\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  2. Rosenbrock function \(f_{2}(x)={\sum }_{i = 1}^{n-1}\left [100(x_{i + 1}-{x_{i}^{2}})^{2}+(x_{i}-1)^{2}\right ]\) has global minimum \(x_{i}^{*}= 1\) and minimum function value f(x) = 0.

  3. Sum of different powers function \(f_{3}(x)={\sum }_{i = 1}^{n}|x_{i}|^{i + 1}\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  4. Schwefel 2.26 function \(f_{4}(x)= 418.9829n-{\sum }_{i = 1}^{n}x_{i}\sin (\sqrt {|x_{i}|})\) has global minimum \(x_{i}^{*}= 420.9687\) and minimum function value f(x) = 0.

  5. Rastrigin function \(f_{5}(x)={\sum }_{i = 1}^{n}({x_{i}^{2}}-10\cos (2\pi x_{i})+ 10)\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  6. Griewank function \(f_{6}(x)=\frac {1}{4000}{\sum }_{i = 1}^{n}{x_{i}^{2}}-{\prod }_{i = 1}^{n}\cos (\frac {x_{i}}{\sqrt {i}})+ 1\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  7. Qing function \(f_{7}(x)={\sum }_{i = 1}^{n}({x_{i}^{2}}-i)^{2}\) has global minimum \(x_{i}^{*}=\pm \sqrt {i}\) and minimum function value f(x) = 0.

  8. Alpine01 function \(f_{8}(x)={\sum }_{i = 1}^{n}|x_{i}\sin (x_{i})+ 0.1x_{i}|\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  9. Ackley function \(f_{9}(x)= -20 e^{-0.2 \sqrt {\frac {1}{n} {\sum }_{i = 1}^{n} {x_{i}^{2}}}} - e^{\frac {1}{n} {\sum }_{i = 1}^{n} \cos (2\pi x_{i})} + 20 + e\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.

  10. Zakharov function \(f_{10}(x)={\sum }_{i = 1}^{n} {x_{i}^{2}}+({\sum }_{i = 1}^{n} 0.5ix_{i})^{2} + ({\sum }_{i = 1}^{n} 0.5ix_{i})^{4}\) has global minimum \(x_{i}^{*}= 0\) and minimum function value f(x) = 0.
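Four of these benchmarks translate directly into code. The vectorized NumPy form below is ours, but the formulas follow the standard definitions of the functions listed above; all four attain their minimum value 0 at the origin.

```python
import numpy as np

def sphere(x):
    """f1: sum of squares."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def rastrigin(x):
    """f5: sum of x_i^2 - 10 cos(2*pi*x_i) + 10."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def griewank(x):
    """f6: quadratic term minus a product of cosines, plus 1."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000.0
                 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

def ackley(x):
    """f9: exponential of the RMS term plus exponential of the cosine mean."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
                 + 20.0 + np.e)
```

For example, `sphere([1.0, 2.0])` evaluates to 5.0, and each function returns 0 (to floating-point precision) at its global minimizer.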


About this article


Cite this article

Grover, J., Hanmandlu, M. New evolutionary optimization method based on information sets. Appl Intell 48, 3394–3410 (2018). https://doi.org/10.1007/s10489-018-1154-x

