
SRIME: a strengthened RIME with Latin hypercube sampling and embedded distance-based selection for engineering optimization problems

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

This paper proposes a strengthened RIME algorithm (SRIME) to tackle continuous optimization problems. RIME is a recently proposed physics-based evolutionary algorithm (EA) inspired by the soft-rime and hard-rime growth processes of rime ice, and it has a powerful exploitation ability. However, on complex optimization problems, RIME easily becomes trapped in local optima and the optimization stagnates. To address this issue, we introduce three techniques into the original RIME: (1) Latin hypercube sampling replaces the random generator as the initialization strategy, (2) a modified hard-rime search strategy, and (3) an embedded distance-based selection mechanism. We evaluate the proposed SRIME on the 10-D, 30-D, 50-D, and 100-D CEC2020 benchmark functions and eight real-world engineering optimization problems against nine state-of-the-art EAs. Experimental and statistical results show that the three techniques significantly accelerate the optimization of the RIME algorithm, and that SRIME is a competitive optimization technique for real-world applications. Ablation experiments analyze the three proposed techniques independently and show that the embedded distance-based selection contributes the most to the improvement of SRIME. The source code of SRIME is available at https://github.com/RuiZhong961230/SRIME.
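To make two of the abstract's techniques concrete, the sketch below shows how a Latin-hypercube-sampled initial population and a generic distance-based selection step could look in Python. This is an illustrative reconstruction based only on the abstract, not the authors' implementation: the helper names (`init_population_lhs`, `distance_based_select`) and the tie-breaking rule that favours candidates farther from the current best are assumptions; the exact selection rule embedded in SRIME may differ.

```python
# Illustrative sketch (not the authors' code): Latin hypercube initialization
# and a generic distance-based selection step, as described in the abstract.
import numpy as np
from scipy.stats import qmc


def init_population_lhs(pop_size, dim, lb, ub, seed=None):
    """Generate an initial population with Latin hypercube sampling.

    Compared with a plain uniform random generator, LHS stratifies each
    dimension so the initial population covers the search space more evenly.
    """
    sampler = qmc.LatinHypercube(d=dim, seed=seed)
    unit_samples = sampler.random(n=pop_size)   # points in [0, 1)^dim
    return qmc.scale(unit_samples, lb, ub)      # map to the box [lb, ub]


def distance_based_select(parents, parent_fitness, offspring, offspring_fitness):
    """Hypothetical distance-based selection (minimization assumed).

    An offspring replaces its parent when it improves fitness; when fitness is
    essentially tied, the candidate farther from the current best survives, so
    diversity is preserved. This is one common variant, not the SRIME rule.
    """
    best = parents[np.argmin(parent_fitness)]
    new_pop, new_fit = parents.copy(), parent_fitness.copy()
    for i in range(len(parents)):
        if offspring_fitness[i] < parent_fitness[i]:
            new_pop[i], new_fit[i] = offspring[i], offspring_fitness[i]
        elif np.isclose(offspring_fitness[i], parent_fitness[i]):
            if np.linalg.norm(offspring[i] - best) > np.linalg.norm(parents[i] - best):
                new_pop[i], new_fit[i] = offspring[i], offspring_fitness[i]
    return new_pop, new_fit


if __name__ == "__main__":
    # Minimal usage example on the 10-D sphere function.
    dim, pop_size = 10, 30
    lb, ub = [-100.0] * dim, [100.0] * dim
    pop = init_population_lhs(pop_size, dim, lb, ub, seed=0)
    fit = np.sum(pop ** 2, axis=1)
    print("best initial fitness:", fit.min())
```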


Data availability

The code for this research can be downloaded from https://github.com/RuiZhong961230/SRIME.

Notes

  1. Pictures were downloaded from https://pixabay.com/ as copyright-free images. (a) https://pixabay.com/photos/barbed-wire-frost-frozen-cold-ice-1938842/. (c) https://pixabay.com/photos/thuja-ice-winter-cold-frozen-6015613/.


Acknowledgements

This work was supported by JSPS KAKENHI Grant Numbers JP20K11967 and 21A402, and JST SPRING Grant Number JPMJSP2119.

Author information


Contributions

RZ: Conceptualization, Methodology, Investigation, Writing—original draft, Writing—review & editing, and Funding acquisition. JY: Investigation, Methodology, Formal Analysis, and Writing—review & editing. CZ: Conceptualization and Writing—review & editing. MM: Writing—review & editing, and Project administration.

Corresponding author

Correspondence to Rui Zhong.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhong, R., Yu, J., Zhang, C. et al. SRIME: a strengthened RIME with Latin hypercube sampling and embedded distance-based selection for engineering optimization problems. Neural Comput & Applic 36, 6721–6740 (2024). https://doi.org/10.1007/s00521-024-09424-4


  • DOI: https://doi.org/10.1007/s00521-024-09424-4
