Abstract
The fuzzy min–max (FMM) neural network is an effective solver for classification problems. Despite its success, it has been observed that FMM produces overlapping hyper-boxes on some datasets, which degrades the overall classification performance, and that FMM incurs a high computational complexity, especially when dealing with high-dimensional datasets. A hybrid model combining the Arithmetic Optimization Algorithm (AOA) and the accelerated fuzzy min–max (AFMM) neural network is proposed to produce an AFMM-AOA model: AFMM speeds up the hyper-box contraction process and reduces the number of hyper-boxes, and AOA is then employed to select the optimal feature set for each hyper-box, which lowers the computational complexity and mitigates the overlapping problem. Furthermore, the AOA algorithm is modified (MAOA) by introducing both random and neighbor search methods, enhancing the exploitation ability of the original AOA when handling the high dimensionality of the hyper-box representation. The performance of the proposed methods is evaluated on twelve datasets; the neighbor search method outperforms the random search, and both methods show superior performance compared with the original AOA and several state-of-the-art algorithms.
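To make the hyper-box machinery concrete, the sketch below implements Simpson's classic FMM membership function [1], which scores how well a pattern fits a hyper-box defined by its min point v and max point w, with a sensitivity parameter gamma controlling how quickly membership decays outside the box. This is an illustrative pure-Python reimplementation of the standard formula, not the authors' AFMM-AOA code; the function name is ours.

```python
def fmm_membership(a, v, w, gamma=1.0):
    """Simpson's FMM membership of pattern `a` (features scaled to [0, 1])
    in the hyper-box with min point `v` and max point `w`.

    Per dimension, a penalty accrues when a_i lies above w_i or below v_i;
    the average over both bounds of all n dimensions gives the membership.
    """
    n = len(a)
    total = 0.0
    for ai, vi, wi in zip(a, v, w):
        # Penalty for exceeding the max point in this dimension.
        total += max(0.0, 1.0 - max(0.0, gamma * min(1.0, ai - wi)))
        # Penalty for falling below the min point in this dimension.
        total += max(0.0, 1.0 - max(0.0, gamma * min(1.0, vi - ai)))
    return total / (2 * n)
```

A pattern inside the box gets membership 1; membership decreases linearly with distance outside each bound, at a rate set by gamma. During learning, each training pattern is assigned to the best-matching hyper-box of its class, which is expanded (subject to a size limit) or spawned anew, and overlaps between hyper-boxes of different classes are removed by contraction, the step AFMM accelerates.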










Data availability
All benchmark datasets used in this study are publicly available at UCI machine learning repository (http://archive.ics.uci.edu/ml) [50].
References
Simpson PK (1992) Fuzzy min–max neural networks—part 1: classification. IEEE Trans Neural Netw 3(5):776–786
Simpson PK (1993) Fuzzy min-max neural networks-part 2: Clustering. IEEE Trans Fuzzy Syst 1(1):32
Khuat TT, Ruta D, Gabrys B (2021) Hyper-box-based machine learning algorithms: a comprehensive survey. Soft Comput 25(2):1325–1363
Rudin C (2019) Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead. Nature Machine Intelligence 1(5):206–215
Khuat TT, Ruta D, Gabrys B (2021) Hyperbox-based machine learning algorithms: a comprehensive survey. Soft Comput 25(2):1325–1363
Carpenter GA, Grossberg S (1987) A massively parallel architecture for a self-organizing neural pattern recognition machine. Computer vision, graphics, and image processing 37(1):54–115
Zhang H, Liu J, Ma D, Wang Z (2011) Data-core-based fuzzy min–max neural network for pattern classification. IEEE Trans Neural Networks 22(12):2339–2352
Davtalab R, Dezfoulian MH, Mansoorizadeh M (2013) Multi-level fuzzy min-max neural network classifier. IEEE Trans Neural Netw Learn Syst 25(3):470–482
Gabrys B (2002) Combining neuro-fuzzy classifiers for improved generalisation and reliability. In: Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN'02), vol 3, pp 2410–2415. IEEE
Khuat TT, Gabrys B (2021) Accelerated learning algorithms of general fuzzy min-max neural network using a novel hyper-box selection rule. Inf Sci 547:887–909
Quteishat A, Lim CP, Tan KS (2010) A modified fuzzy min–max neural network with a genetic-algorithm-based rule extractor for pattern classification. IEEE Trans Syst Man Cyber-Part A: Syst Humans 40(3):641–650
Liu J, Yu Z, Ma D (2012) An adaptive fuzzy min-max neural network classifier based on principle component analysis and adaptive genetic algorithm. Math Prob Eng 2012
Shunmugapriya P, Kanmani S (2017) A hybrid algorithm using ant and bee colony optimization for feature selection and classification (AC-ABC Hybrid). Swarm Evol Comput 36:27–36
Hancer E, Xue B, Zhang M, Karaboga D, Akay B (2018) Pareto front feature selection based on artificial bee colony optimization. Inf Sci 422:462–479
Zhang Y, Gong D, Hu Y, Zhang W (2015) Feature selection algorithm based on bare bones particle swarm optimization. Neurocomputing 148:150–157
Wang KJ, Chen KH, Angelia MA (2014) An improved artificial immune recognition system with the opposite sign test for feature selection. Knowl-Based Syst 71:126–145
Marill T, Green D (1963) On the effectiveness of receptors in recognition systems. IEEE Trans Inf Theory 9(1):11–17
Alzaqebah M, Alrefai N, Ahmed EA, Jawarneh S, Alsmadi MK (2020) Neighborhood search methods with moth optimization algorithm as a wrapper method for feature selection problems. Int J Electr Comput Eng 10(4)
Ma B, Xia Y (2017) A tribe competition-based genetic algorithm for feature selection in pattern classification. Appl Soft Comput 58:328–338
Abu Khurma R, Aljarah I, Sharieh A, Abd Elaziz M, Damaševičius R, Krilavičius T (2022) A review of the modification strategies of the nature inspired algorithms for feature selection problem. Mathematics 10(3):464
Ishibuchi H, Murata T, Türkşen IB (1997) Single-objective and two-objective genetic algorithms for selecting linguistic rules for pattern classification problems. Fuzzy Sets Syst 89(2):135–150
Xue B, Zhang M, Browne WN (2012) Particle swarm optimization for feature selection in classification: a multi-objective approach. IEEE Trans Cybern 43(6):1656–1671
Aghdam MH, Ghasem-Aghaee N, Basiri ME (2009) Text feature selection using ant colony optimization. Expert Syst Appl 36(3):6843–6853
Talbi EG (ed) (2013) Hybrid metaheuristics (Vol. 166). Springer, Heidelberg
Zhang L, Mistry K, Lim CP, Neoh SC (2018) Feature selection using firefly optimization for classification and regression models. Decis Support Syst 106:64–85
Alzaqebah M, Jawarneh S, Mohammad RMA, Alsmadi MK, Almarashdeh I (2021) Improved multi-verse optimizer feature selection technique with application to phishing, spam, and denial of service attacks. Int J Commun Netw Inf Secur 13(1):76–81
Emary E, Zawbaa HM, Hassanien AE (2016) Binary grey wolf optimization approaches for feature selection. Neurocomputing 172:371–381
Agrafiotis DK, Cedeno W (2002) Feature selection for structure− activity correlation using binary particle swarms. J Med Chem 45(5):1098–1107
Brezočnik L, Fister I Jr, Podgorelec V (2018) Swarm intelligence algorithms for feature selection: a review. Appl Sci 8(9):1521
Pourpanah F, Tan CJ, Lim CP, Mohamad-Saleh J (2017) A Q-learning-based multi-agent system for data classification. Appl Soft Comput 52:519–531
Alwohaibi M, Alzaqebah M, Alotaibi NM, Alzahrani AM, Zouch M (2022) A hybrid multi-stage learning technique based on brain storming optimization algorithm for breast cancer recurrence prediction. J King Saud Univ-Comput Inf Sci 34(8):5192–5203
Alzaqebah M, Jawarneh S, Mohammad RMA, Alsmadi MK, Al-Marashdeh I, Ahmed EA, Alghamdi FA (2021) Hybrid feature selection method based on particle swarm optimization and adaptive local search method. Int J Electr Comput Eng 11(3):2414
Thaher T, Chantar H, Too J, Mafarja M, Turabieh H, Houssein EH (2022) Boolean particle swarm optimization with various evolutionary population dynamics approaches for feature selection problems. Expert Syst Appl 195:116550
Faizan M, Alsolami F, Khan RA (2022) Hybrid binary butterfly optimization algorithm and simulated annealing for feature selection problem. Int J Appl Metaheuristic Comput (IJAMC) 13(1):1–18
Wang X, Wang Y, Wong KC, Li X (2022) A self-adaptive weighted differential evolution approach for large-scale feature selection. Knowl-Based Syst 235:107633
Pourpanah F, Lim CP, Hao Q (2019) A reinforced fuzzy ARTMAP model for data classification. Int J Mach Learn Cybern 10(7):1643–1655
Riquelme JC, Aguilar JS, Toro M (1999) A decision queue based on genetic algorithms: axis-parallel classifier versus rotated hyper-boxes. Comput Intell Appl 123–128
Pourpanah F, Lim CP, Saleh JM (2016) A hybrid model of fuzzy ARTMAP and genetic algorithm for data classification and rule extraction. Expert Syst Appl 49:74–85
Wang Y, Huang W, Wang J (2021) Redefined fuzzy min-max neural network. In: 2021 International Joint Conference on Neural Networks (IJCNN), pp 1–8. IEEE
Pourpanah F, Lim CP, Wang X, Tan CJ, Seera M, Shi Y (2019) A hybrid model of fuzzy min–max and brain storm optimization for feature selection and data classification. Neurocomputing 333:440–451
Abualigah L, Diabat A, Mirjalili S, Abd Elaziz M, Gandomi AH (2021) The arithmetic optimization algorithm. Comput Methods Appl Mech Eng 376:113609
Hijjawi M, Alshinwan M, Khashan OA, Alshdaifat M, Almanaseer W, Alomoush W, Abualigah L (2023) Accelerated arithmetic optimization algorithm by cuckoo search for solving engineering design problems. Processes 11(5):1380
Abualigah L, Diabat A (2022) Improved multi-core arithmetic optimization algorithm-based ensemble mutation for multidisciplinary applications. J Intell Manuf 1–42.
Dhal KG, Sasmal B, Das A, Ray S, Rai R (2023) A Comprehensive Survey on Arithmetic Optimization Algorithm. Arch Comput Methods Eng 1–26.
Abualigah L, Ewees AA, Al-qaness MA, Elaziz MA, Yousri D, Ibrahim RA, Altalhi M (2022) Boosting arithmetic optimization algorithm by sine cosine algorithm and levy flight distribution for solving engineering optimization problems. Neural Comput Appl 34(11):8823–8852
Pourpanah F, Wang D, Wang R, Lim CP (2021) A semisupervised learning model based on fuzzy min–max neural networks for data classification. Appl Soft Comput 112:107856
Mirjalili S, Lewis A (2013) S-shaped versus V-shaped transfer functions for binary particle swarm optimization. Swarm Evol Comput 9:1–14
Emary E, Zawbaa HM, Hassanien AE (2016) Binary ant lion approaches for feature selection. Neurocomputing 213:54–65
Hans R, Kaur H (2020) Hybrid binary Sine Cosine Algorithm and Ant Lion Optimization (SCALO) approaches for feature selection problem. Int J Comput Mater Sci Eng 9(01):1950021
Dua D, Graff C (2019) UCI machine learning repository [http://archive.ics.uci.edu/ml]. University of California, School of Information and Computer Science, Irvine, CA
Debuse JC, Rayward-Smith VJ (1997) Feature subset selection within a simulated annealing data mining algorithm. J Intell Inf Syst 9(1):57–81
Mirjalili S (2015) The ant lion optimizer. Adv Eng Softw 83:80–98
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
About this article
Cite this article
Alzaqebah, M., Ahmed, E.A.E. Accelerated fuzzy min–max neural network and arithmetic optimization algorithm for optimizing hyper-boxes and feature selection. Neural Comput & Applic 36, 1553–1568 (2024). https://doi.org/10.1007/s00521-023-09131-6