
Ant colony optimization edge selection for support vector machine speed optimization

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Support vector machine (SVM) is a widely used and reliable machine learning algorithm. It has been successfully applied to many real-world problems with remarkable results. However, its computational complexity increases with data size. Although many SVM speed optimization techniques have been proposed in the literature, there is still a need to improve both the speed and the accuracy of this algorithm. In this paper, a boundary detection algorithm for SVM speed optimization, called the ant colony optimization instance selection algorithm (ACOISA), is proposed. ACOISA is inspired by edge selection in the ant colony optimization (ACO) algorithm and performs two primary functions: boundary detection and boundary instance selection. In the algorithm, ACO is used for boundary detection and the k-nearest neighbor algorithm is used for boundary instance selection. Several sets of experiments were carried out to validate the efficiency of the proposed technique. All experiments were performed on 35 datasets: datasets covering three well-known e-fraud types (credit card fraud, email spam and phishing email) and 31 other datasets available from the UCI data repository. The experimental results showed that the proposed technique improved SVM training speed on 100% of the datasets used for evaluation, without significantly affecting SVM classification quality. Furthermore, Friedman's test and Holm's post hoc test were conducted to statistically validate the credibility of the results. The statistical test results revealed that the proposed technique is significantly faster than standard SVM and several existing instance selection techniques in all cases.
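The abstract describes two stages: ACO-inspired boundary detection and k-NN-based boundary instance selection, after which the SVM is trained on the reduced subset. The sketch below is a minimal, hypothetical illustration of the k-NN boundary check alone; it omits the ACO pheromone-guided search entirely and is not the authors' implementation. It assumes scikit-learn and NumPy, and the helper name select_boundary_instances is invented for illustration.

# Hypothetical sketch: keep only instances whose k nearest neighbours include a
# different class label (a common k-NN boundary heuristic), then train the SVM
# on that subset. Not the ACOISA algorithm itself.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def select_boundary_instances(X, y, k=5):
    # Fit k+1 neighbours because the nearest neighbour of each point is itself.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)                  # idx[:, 0] is the point itself
    neighbour_labels = y[idx[:, 1:]]           # labels of the k true neighbours
    is_boundary = (neighbour_labels != y[:, None]).any(axis=1)
    return np.where(is_boundary)[0]

# Usage on synthetic data: train the SVM only on the selected boundary subset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
subset = select_boundary_instances(X, y, k=5)
clf = SVC(kernel="rbf").fit(X[subset], y[subset])

Training on X[subset] rather than on the full set is what yields the kind of speedup the abstract reports, since SVM training cost grows quickly with the number of training instances.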


Abbreviations

D: Dataset
\( \text{dist}[a, b] \): Distance between two data instances a and b
E: Edge
HV: Heuristic value (see the standard ACO edge-selection rule noted after this list)
K: Number of k-nearest neighbors
MaxG: Maximum generation
N: Size of the entire training set
NL: Neighborhood list
NR: Neighborhood range
NRuns: Number of runs for SVM cross-validation
NSub: Size of the training subset
\( T_{\text{s}} \): Training subset
ABC: Artificial bee colony
Accr.: Accuracy
ACO: Ant colony optimization
ACOISA: Ant colony optimization instance selection algorithm
AFP: Accelerated flower pollination
ALO: Antlion optimization
ANN: Artificial neural network
BA: Bat algorithm
BPSO: Binary particle swarm optimization
CSA: Clonal selection algorithm
DBC: Directed bee colony
DT: Decision tree
EA: Evolutionary algorithm
ELM: Extreme learning machine
FCNN: Fast condensed nearest neighbor
FFA: Firefly algorithm
FN: False negative
FP: False positive
FPA: Flower pollination algorithm
GOA: Grasshopper optimization algorithm
GWO: Gray wolf optimization
IG: Information gain
ISDSP: Instance selection based on dense spatial partitions
IWD: Intelligent water drop
k-NN: k-nearest neighbor
LDIS: Local density-based instance selection
LSBO: Local set border selector
LSCO: Local set-based centroid selector
LSSM: Local set-based smoother
MOCHC: Multi-objective cross-generational elitist selection, heterogeneous recombination and cataclysmic mutation
MRMC-IWD: Master River Multiple Creeks intelligent water drops
NSGA-II: Non-dominated sorting genetic algorithm II
PSO: Particle swarm optimization
RBF: Radial basis function
SSA: Social spider algorithm
Stor.: Storage reduction
UCI: University of California, Irvine
VQN: Vaguely quantified nearest
XLDIS: Extended local density-based instance selection
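
For context, the edge-selection behaviour that the edge (E) and heuristic value (HV) entries above refer to is conventionally expressed in standard ACO as the random-proportional rule (the paper's exact formulation may differ):

\( p_{ij}^{m} = \dfrac{\tau_{ij}^{\alpha}\, \eta_{ij}^{\beta}}{\sum_{l \in \mathcal{N}_{i}^{m}} \tau_{il}^{\alpha}\, \eta_{il}^{\beta}} \)

where \( \tau_{ij} \) is the pheromone deposited on edge \( (i, j) \), \( \eta_{ij} \) is its heuristic value, \( \alpha \) and \( \beta \) weight the two terms, and \( \mathcal{N}_{i}^{m} \) is the feasible neighborhood of ant \( m \) at node \( i \).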

References

  1. Arbatskaya MN, Mukhopadhaya K, Rasmusen EB (2006) The parking lot problem (December 19, 2006). https://ssrn.com/abstract=571101 or http://dx.doi.org/10.2139/ssrn.571101

  2. Adewumi AO, Ali MM (2010) A multi-level genetic algorithm for a multi-stage space allocation problem. Math Comput Model 51(1):109–126

    MathSciNet  MATH  Google Scholar 

  3. Chen J, Zhang C, Xue X, Liu C-L (2013) Fast instance selection for speeding up support vector machines. Knowl Based Syst 45:1–7

    Google Scholar 

  4. Chapelle O (2007) Training a support vector machine in the primal. Neural Comput 19(5):1155–1178

    MathSciNet  MATH  Google Scholar 

  5. Panda N, Chang EY, Wu G (2006) Concept boundary detection for speeding up SVMs. In: Proceedings of the 23rd international conference on machine learning, pp 681–688

  6. Martens D, Baesens B, Fawcett T (2011) Editorial survey: swarm intelligence for data mining. Mach Learn 82(1):1–42

    MathSciNet  Google Scholar 

  7. Wilson DR, Martinez TR (2000) Reduction techniques for instance-based learning algorithms. Mach Learn 38(3):257–286

    MATH  Google Scholar 

  8. Tian J, Yu W, Xie S (2008) An ant colony optimization algorithm for image edge detection. In: 2008 IEEE congress on evolutionary computation (IEEE world congress on computational intelligence), pp 751–756

  9. Nayak M, Dash P (2016) Edge detection improvement by ant colony optimization compared to traditional methods on brain MRI image. Commun Appl Electron (CAE) 5(8):19–23

    Google Scholar 

  10. Gautam A, Biswas M (2019) Edge detection technique using ACO with PSO for noisy image. Recent developments in machine learning and data analytics. Springer, Singapore, pp 383–396

    Google Scholar 

  11. Olvera-López JA, Carrasco-Ochoa JA, Martínez-Trinidad JF, Kittler J (2010) A review of instance selection methods. Artif Intell Rev 34(2):133–143

    Google Scholar 

  12. Brighton H, Mellish C (2002) Advances in instance selection for instance-based learning algorithms. Data Min Knowl Discov 6(2):153–172

    MathSciNet  MATH  Google Scholar 

  13. Hart P (1968) The condensed nearest neighbor rule (Corresp.). IEEE Trans Inf Theory 14(3):515–516

    Google Scholar 

  14. Angiulli F (2007) Fast nearest neighbor condensation for large data sets classification. IEEE Trans Knowl Data Eng 19(11):1450–1464

    Google Scholar 

  15. Ritter G, Woodruff H, Lowry S, Isenhour T (1975) An algorithm for a selective nearest neighbor decision rule (Corresp.). IEEE Trans Inf Theory 21(6):665–669

    MATH  Google Scholar 

  16. Chien-Hsing C, Bo-Han K, Fu C (2006) The generalized condensed nearest neighbor rule as a data reduction method. In: 18th international conference on pattern recognition (ICPR’06), pp 556–559

  17. Wilson DL (1972) Asymptotic properties of nearest neighbor rules using edited data. IEEE Trans Syst Man Cybernet 2(3):408–421

    MathSciNet  MATH  Google Scholar 

  18. Tomek I (1976) An experiment with the edited nearest-neighbor rule. IEEE Trans Syst Man Cybernet 6(6):448–452

    MathSciNet  MATH  Google Scholar 

  19. Devijver PA (1980) On the edited nearest neighbor rule. In: Proceedings of 5th international conference on pattern recognition

  20. Vázquez F, Sánchez JS, Pla F (2005) A stochastic approach to Wilson’s editing algorithm. In: Iberian conference on pattern recognition and image analysis, pp 35–42

  21. Aha DW, Kibler D, Albert MK (1991) Instance-based learning algorithms. Mach Learn 6(1):37–66

    Google Scholar 

  22. Zhao K-P, Zhou S-G, Guan J-H, Zhou A-Y (2003) C-pruner: an improved instance pruning algorithm. In: Proceedings of the 2003 international conference on machine learning and cybernetics (IEEE Cat. No. 03EX693), pp 94–99

  23. Li Y, Hu Z, Cai Y, Zhang W (2005) Support vector based prototype selection method for nearest neighbor rules. In: International conference on natural computation, pp 528–535

  24. Srisawat A, Phienthrakul T, Kijsirikul B (2006) SV-kNNC: an algorithm for improving the efficiency of k-nearest neighbor. In: Pacific rim international conference on artificial intelligence, pp 975–979

  25. Kuncheva LI (1995) Editing for the k-nearest neighbors rule by a genetic algorithm. Pattern Recognit Lett 16(8):809–814

    Google Scholar 

  26. Kuncheva LI, Bezdek JC (1998) “Nearest prototype classification: clustering, genetic algorithms, or random search? IEEE Trans Syst Man Cybernet Part C (Appl Rev) 28(1):160–164

    Google Scholar 

  27. Cano JR, Herrera F, Lozano M (2003) Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study. IEEE Trans Evol Comput 7(6):561–575

    Google Scholar 

  28. García S, Cano JR, Herrera F (2008) A memetic algorithm for evolutionary prototype selection: a scaling up approach. Pattern Recognit 41(8):2693–2709

    MATH  Google Scholar 

  29. Garain U (2008) Prototype reduction using an artificial immune model. Pattern Anal Appl 11(3):353–363

    MathSciNet  Google Scholar 

  30. Cerveron V, Ferri FJ (2001) “Another move toward the minimum consistent subset: a tabu search approach to the condensed nearest neighbor rule. IEEE Trans Syst Man Cybernet Part B (Cybernet) 31(3):408–413

    Google Scholar 

  31. Zhang H, Sun G (2002) Optimal reference subset selection for nearest neighbor classification by tabu search. Pattern Recognit 35(7):1481–1490

    MATH  Google Scholar 

  32. Olvera-López JA, Carrasco-Ochoa JA, Martínez-Trinidad JF (2005) Sequential search for decremental edition. In: International conference on intelligent data engineering and automated learning, pp 280–285

  33. Pudil P, Ferri FJ, Novovicova J, Kittler J (1994) Floating search methods for feature selection with nonmonotonic criterion functions. In: Proceedings of the 12th IAPR international conference on pattern recognition, Vol. 3-conference c: signal processing (Cat. No. 94CH3440-5), pp 279–283

  34. Olvera-López JA, Martínez-Trinidad JF, Carrasco-Ochoa JA (2007) Restricted sequential floating search applied to object selection. In: International workshop on machine learning and data mining in pattern recognition, pp 694–702

  35. Riquelme JC, Aguilar-Ruiz JS, Toro M (2003) Finding representative patterns with ordered projections. Pattern Recognit 36(4):1009–1018

    Google Scholar 

  36. Raicharoen T, Lursinsap C (2005) A divide-and-conquer approach to the pairwise opposite class-nearest neighbor (POC-NN) algorithm. Pattern Recognit Lett 26(10):1554–1567

    Google Scholar 

  37. Narayan BL, Murthy CA, Pal SK (2006) Maxdiff kd-trees for data condensation. Pattern Recognit Lett 27(3):187–200

    Google Scholar 

  38. Caises Y, González A, Leyva E, Pérez R (2009) SCIS: combining instance selection methods to increase their effectiveness over a wide range of domains. In: International conference on intelligent data engineering and automated learning, pp 17–24

  39. Spillmann B, Neuhaus M, Bunke H, Pękalska E, Duin RP (2006) Transforming strings to vector spaces using prototype selection. In: Joint IAPR international workshops on statistical techniques in pattern recognition (SPR) and structural and syntactic pattern recognition (SSPR), pp 287–296

  40. Mollineda RA, Ferri FJ, Vidal E (2002) An efficient prototype merging strategy for the condensed 1-NN rule through class-conditional hierarchical clustering. Pattern Recognit 35(12):2771–2782

    MATH  Google Scholar 

  41. Veenman CJ, Reinders MJ (2005) The nearest subclass classifier: a compromise between the nearest mean and nearest neighbor classifier. IEEE Trans Pattern Anal Mach Intell 27(9):1417–1429

    Google Scholar 

  42. Lumini A, Nanni L (2006) A clustering method for automatic biometric template selection. Pattern Recognit 39(3):495–497

    MATH  Google Scholar 

  43. Paredes R, Vidal E (2000) Weighting prototypes-a new editing approach. In: Proceedings 15th international conference on pattern recognition. ICPR-2000, pp 25–28

  44. Olvera-López JA, Carrasco-Ochoa JA, Martínez-Trinidad JF (2008) Prototype selection via prototype relevance. In: Iberoamerican congress on pattern recognition, pp 153–160

  45. Leyva E, González A, Pérez R (2015) Three new instance selection methods based on local sets: a comparative study with several approaches from a bi-objective perspective. Pattern Recognit 48(4):1523–1537

    Google Scholar 

  46. Carbonera JL, Abel M (2015) A density-based approach for instance selection. In: 2015 IEEE 27th international conference on tools with artificial intelligence (ICTAI), Italy, pp 768–774

  47. Carbonera JL (2017) An efficient approach for instance selection. International conference on big data analytics and knowledge discovery. Springer, Cham

    Google Scholar 

  48. Carbonera JL, Abel M (2018) Efficient instance selection based on spatial abstraction. In: 2018 IEEE 30th international conference on tools with artificial intelligence (ICTAI), pp 286–292

  49. Rathee S, Ratnoo S, Ahuja J (2019) Instance selection using multi-objective CHC evolutionary algorithm. Information and communication technology for competitive strategies. Springer, Springer, pp 475–484

    Google Scholar 

  50. Anwar IM, Salama KM, Abdelbar AM (2015) ADR-miner: an ant-based data reduction algorithm for classification. In: 2015 IEEE congress on evolutionary computation (CEC), pp 515–521

  51. Tsai C-F, Cheng K-C (2012) Simple instance selection for bankruptcy prediction. Knowl-Based Syst 27:333–342

    Google Scholar 

  52. Koggalage R, Halgamuge S (2004) Reducing the number of training samples for fast support vector machine classification. Neural Inf Process Lett Rev 2(3):57–65

    Google Scholar 

  53. Dorigo M (1992) Optimization, learning and natural algorithms. Ph.D. Thesis, Politecnico di Milano, Italy

  54. Dorigo M, Blum C (2005) Ant colony optimization theory: a survey. Theor Comput Sci 344(2):243–278

    MathSciNet  MATH  Google Scholar 

  55. Katiyar S, Ansari AQ (2015) Ant colony optimization: a tutorial review. MR Int J Eng Technol 7(2):35–41

    Google Scholar 

  56. Dorigo M, Birattari M, Stutzle T (2006) Ant colony optimization. IEEE Comput Intell Mag 1(4):28–39

    Google Scholar 

  57. Shekhawat A, Poddar P, Boswal D (2009) Ant colony optimization algorithms: introduction and beyond. Department of Computer Science and Engineering, Indian Institute of Tecnology Bombay, Mumbai

    Google Scholar 

  58. Dorigo M (1992) Optimization, learning and natural algorithms. Ph.D. Thesis, Politecnico di Milano, Italy

  59. Dorigo M, Gambardella LM (1997) Ant colony system: a cooperative learning approach to the traveling salesman problem. IEEE Trans Evol Comput 1(1):53–66

    Google Scholar 

  60. Stutzle T, Hoos H (1997) MAX-MIN ant system and local search for the traveling salesman problem. In: IEEE international conference on evolutionary computation, pp 309–314

  61. Gutjahr WJ (2008) First steps to the runtime complexity analysis of ant colony optimization. Comput Oper Res 35(9):2711–2727

    MathSciNet  MATH  Google Scholar 

  62. Neumann F, Sudholt D, Witt C (2009) Computational complexity of ant colony optimization and its hybridization with local search. Innovations in swarm intelligence. Springer, Berlin, Heidelberg, pp 91–120

    Google Scholar 

  63. Brighton H, Mellish C (2002) Advances in instance selection for instance-based learning algorithms. Data Min Knowl Discov 6(2):153–172

    MathSciNet  MATH  Google Scholar 

  64. Garcı S, Triguero I, Carmona CJ, Herrera F (2012) Evolutionary-based selection of generalized instances for imbalanced classification. Knowl Based Syst 25(1):3–12

    Google Scholar 

  65. Angiulli F, Astorino A (2010) Scaling up support vector machines using nearest neighbor condensation. IEEE Trans Neural Netw 21(2):351–357

    Google Scholar 

  66. Angiulli F (2007) Fast nearest neighbor condensation for large data sets classification. IEEE Trans Knowl Data Eng 19(11):1450–1464

    Google Scholar 

  67. Al-Yaseen WL, Othman ZA, Nazri MZA (2017) Multi-level hybrid support vector machine and extreme learning machine based on modified K-means for intrusion detection system. Expert Syst Appl 67:296–303

    Google Scholar 

  68. Schlag S, Schmitt M, Schulz C (2019) Faster support vector machines. In: 2019 Proceedings of the twenty-first workshop on algorithm engineering and experiments (ALENEX), pp 199–210

  69. Akinyelu AA, Adewumi AO (2018) On the performance of cuckoo search and bat algorithms based instance selection techniques for SVM speed optimization with application to e-Fraud detection. KSII Trans Internet Inf Syst 12(3):1348–1375

    Google Scholar 

  70. Arora S, Anand P (2019) Binary butterfly optimization approaches for feature selection. Expert Syst Appl 116:147–160

    Google Scholar 

  71. Agrawal S, Singh B, Kumar R, Dey N (2019) Chapter 9—Machine learning for medical diagnosis: a neural network classifier optimized via the directed bee colony optimization algorithm. In: Dey N, Ashour AS, Fong SJ, Borra S (eds) U-healthcare monitoring systems. Academic Press, Cambridge, pp 197–215

    Google Scholar 

  72. Majhi SK, Mahapatra P (2019) Classification of phishing websites using moth-flame optimized neural network. Emerging technologies in data mining and information security. Springer, Singapore, pp 39–48

    Google Scholar 

  73. Mafarja M, Aljarah I, Faris H, Hammouri AI, Al-Zoubi AM, Mirjalili S (2019) Binary grasshopper optimisation algorithm approaches for feature selection problems. Expert Syst Appl 117:267–286

    Google Scholar 

  74. Kumar L, Bharti KK (2019) An improved BPSO algorithm for feature selection. Recent trends in communication, computing, and electronics. Springer, Singapore, pp 505–513

    Google Scholar 

  75. Zendehboudi A, Baseer M, Saidur R (2018) Application of support vector machine models for forecasting solar and wind energy resources: a review. J Clean Prod 199:272–285

    Google Scholar 

  76. Zhang X, Mei C, Chen D, Yang Y (2018) A fuzzy rough set-based feature selection method using representative instances. Knowl Based Syst 151:216–229

    Google Scholar 

  77. Zawbaa HM, Emary E, Grosan C, Snasel V (2018) Large-dimensionality small-instance set feature selection: a hybrid bio-inspired heuristic approach. Swarm Evolut Comput 42:29–42

    Google Scholar 

  78. Alijla BO, Lim CP, Wong L-P, Khader AT, Al-Betar MA (2018) An ensemble of intelligent water drop algorithm for feature selection optimization problem. Appl Soft Comput 65:531–541

    Google Scholar 

  79. Cervantes J, Garcia-Lamont F, Rodriguez L, López A, Castilla JR, Trueba A (2017) PSO-based method for SVM classification on skewed data sets. Neurocomputing 228:187–197

    Google Scholar 

  80. Shunmugapriya P, Kanmani S (2017) A hybrid algorithm using ant and bee colony optimization for feature selection and classification (AC-ABC Hybrid). Swarm Evolut Comput 36:27–36

    Google Scholar 

  81. Deb K, Pratap A, Agarwal S, Meyarivan T (2002) A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans Evolut Comput 6(2):182–197

    Google Scholar 

  82. Venkatasalam K, Rajendran P, Thangavel M (2019) Improving the accuracy of feature selection in big data mining using accelerated flower pollination (AFP) Algorithm. J Med Syst 43(4):96

    Google Scholar 

  83. Hsu C-W, Chang C-C, Lin C-J (2003) A practical guide to support vector classification. Department of Computer Science, National Taiwan University, Taipei

    Google Scholar 

  84. Fletcher T (2009) Support vector machines explained. http://www.tristanfletcher.co.uk/SVM%20Explained.pdf. Accessed 7 Dec 2016

  85. Scholkopf B, Burges C, Smola A (1998) Geometry and invariance in kernel based methods. In: Advances in kernel methods: support vector learning

  86. Akinyelu AA, Adewumi AO (2017) Improved instance selection methods for support vector machine speed optimization. Secur Commun Netw 2017:11

    Google Scholar 

  87. Hsu C-W, Chang C-C, Lin C-J (2003) A practical guide to support vector classification. Technical report, Department of Computer Science, National Taiwan University, no. 1–16

  88. Chang C-C, Lin C-J (2011) LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol (TIST) 2(3):1–27

    Google Scholar 

  89. Johnson M (2009) SVM.NET. http://www.matthewajohnson.org/software/svm.html. Accessed 5 Aug 2014

  90. Bergholz A, Chang JH, Paaß G, Reichartz F, Strobel S (2008) Improved phishing detection using model-based features. In: Proceedings of the conference on email and anti-spam (CEAS), Mountain View, CA, pp 1–27

  91. Witten IH, Frank E, Hall MA, Pal CJ (2016) Data mining: practical machine learning tools and techniques. Morgan Kaufmann, Burlington

    Google Scholar 

  92. Group C (2006) SpamAssassin Data. http://www.csmining.org/index.php/spam-assassin-datasets.html. Accessed 5 Aug 2014

  93. Nazario J (2005) Phishing Corpus. http://monkey.org/jose/wiki/doku.php?id=PhishingCorpus. Accessed 27 April 2015

  94. Asuncion A, Newman D (2007). UCI machine learning repository. http://archive.ics.uci.edu/ml/datasets.html. Accessed 15 Aug 2016

  95. Andrea (2016) Credit card fraud detection. https://www.kaggle.com/dalpozz/creditcardfraud. Accessed 12 Dec 2016

  96. Shams R, Mercer RE (2013) Classifying spam emails using text and readability features. In: IEEE 13th international conference on data mining, December 7–10, 2013, pp 657–666

  97. Zitar RA, Hamdan A (2013) Genetic optimized artificial immune system in spam detection: a review and a model. Artif Intell Rev 40(3):305–377

    Google Scholar 

  98. Graham P (2002) A plan for spam. http://www.paulgraham.com/spam.html. Accessed 04 Aug 2016

  99. Shams R, Mercer RE (2013) Classifying spam emails using text and readability features. In: 2013 IEEE 13th international conference on data mining, pp 657–666

  100. Duncan R (2016) A simple guide to HTML. http://www.simplehtmlguide.com/whatishtml.php. Accessed 13 Sept 2016

  101. Akinyelu AA, Adewumi AO (2014) Classification of phishing email using random forest machine learning technique. J Appl Math 2014:6, Article ID 425731

  102. Almomani A, Wan T-C, Altaher A, Manasrah A, ALmomani E, Anbar M et al (2012) Evolving fuzzy neural network for phishing emails detection. J Comput Sci 8(7):1099–1107

    Google Scholar 

  103. Fette I, Sadeh N, Tomasic A (2007) Learning to detect phishing emails. In: Proceedings of the 16th international conference on World Wide Web, Banff, AB, Canada, pp 649–656

  104. Zhang N, Yuan Y (2017) CS229 lecture notes, phishing detection using neural network. Department of Statistics, Stanford University, 2012. http://cs229.stanford.edu/proj2012/ZhangYuan-PhishingDetectionUsingNeuralNetwork.pdf. Accessed 10 July 2017

  105. Bergholz A, De Beer J, Glahn S, Moens M-F, Paaß G, Strobel S (2010) New filtering approaches for phishing email. J Comput Secur 18(1):7–35

    Google Scholar 

  106. Adewumi OA, Akinyelu AA (2016) A hybrid firefly and support vector machine classifier for phishing email detection. Kybernetes 45(6):977–994

    MathSciNet  Google Scholar 

  107. Basnet R, Mukkamala S, Sung AH (2008) Detection of phishing attacks: a machine learning approach. In: Prasad B (ed) Soft computing applications in industry. Springer, Berlin, pp 373–383

    Google Scholar 

  108. Olvera-López JA, Carrasco-Ochoa JA, Martínez-Trinidad JF (2010) A new fast prototype selection method based on clustering. Pattern Anal Appl 13(2):131–141

    MathSciNet  Google Scholar 

  109. Chou C-H, Kuo B-H, Chang F (2006) The generalized condensed nearest neighbor rule as a data reduction method. In: 18th international conference on pattern recognition (ICPR’06), pp 556–559

  110. Wilson DR, Martinez TR (1997) Instance pruning techniques. In: Proceedings of the fourteenth international conference on machine learning, pp 403–411

  111. Brighton H, Mellish C (1999) On the consistency of information filters for lazy learning algorithms. In: Żytkow JM, Rauch J (eds) Principles of data mining and knowledge discovery: third european conference, PKDD’99, Prague, Czech Republic, September 15–18, 1999. Proceedings. Springer, Berlin, pp 283–288

  112. Mohan A, Remya G (2014) A parallel implementation of ant colony optimization for tsp based on mapreduce framework. Int J Comput Appl 88(8):9–12

    Google Scholar 

  113. Papesca M (2017) Edge detection using parallel ant colony optimization with Hadoop MapReduce: implementation and scalability. Master of Science Department of Computer Science, State University of New York

  114. Wei W, Yang X-L, Zhou B, Feng J, Shen P-Y (2012) Combined energy minimization for image reconstruction from few views. Math Probl Eng 2012:154630. https://doi.org/10.1155/2012/154630

    Article  MathSciNet  MATH  Google Scholar 

  115. Gopalakrishnan RC, Kuppusamy V (2014) Ant colony optimization approaches to clustering of lung nodules from CT images. Comput Math Methods Med 2014:572494. https://doi.org/10.1155/2014/572494

    Article  Google Scholar 

  116. Carter RJ, Dubchak I, Holbrook SR (2001) A computational approach to identify genes for functional RNAs in genomic sequences. Nucl Acids Res 29(19):3928–3938

    Google Scholar 

  117. Salzberg S (1995) Locating protein coding regions in human DNA using a decision tree algorithm. J Comput Biol 2(3):473–485

    Google Scholar 

  118. Allen JE, Pertea M, Salzberg SL (2004) Computational gene prediction using multiple sources of evidence. Genome Res 14(1):142–148

    Google Scholar 

  119. Batrinca B, Treleaven PC (2015) Social media analytics: a survey of techniques, tools and platforms. AI & Soc 30(1):89–116

    Google Scholar 

  120. Bache K, Lichman M (2013) UCI machine learning repository. http://archive.ics.uci.edu/ml. Accessed 12 May 2017

  121. Wei W, Fan X, Song H, Wang H (2019) Video tamper detection based on multi-scale mutual information. Multimedia Tools Appl 78(19):27109–27126

    Google Scholar 

Download references

Author information

Corresponding author

Correspondence to Andronicus A. Akinyelu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Akinyelu, A.A., Ezugwu, A.E. & Adewumi, A.O. Ant colony optimization edge selection for support vector machine speed optimization. Neural Comput & Applic 32, 11385–11417 (2020). https://doi.org/10.1007/s00521-019-04633-8
