Dolphin Swarm Extreme Learning Machine


Abstract

As a novel learning algorithm for single hidden-layer feedforward neural networks, the extreme learning machine (ELM) has attracted much research attention for its fast training speed and good generalization performance. Instead of iteratively tuning the parameters, the ELM randomly generates the input weights and hidden biases and then determines the output weights by solving a linear optimization problem. However, the random choice of input weights and hidden biases may produce non-optimal parameters, which either degrade the final results or force the network to use more hidden nodes. To overcome these drawbacks, we propose a new hybrid learning algorithm, the dolphin swarm algorithm extreme learning machine, which adopts the dolphin swarm algorithm to optimize the input weights and hidden biases efficiently. Each set of input weights and hidden biases is encoded into one vector, called a dolphin. The dolphins are evaluated by the root mean squared error and updated through the four pivotal phases of the dolphin swarm algorithm, eventually yielding an optimal set of input weights and hidden biases. To evaluate the effectiveness of our method, we compare the proposed algorithm with the standard extreme learning machine and three state-of-the-art methods, namely the particle swarm optimization extreme learning machine, the evolutionary extreme learning machine, and the self-adaptive evolutionary extreme learning machine, on 13 benchmark datasets from the University of California Irvine Machine Learning Repository. The experimental results demonstrate that the proposed method achieves better generalization performance than all the compared algorithms.
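To make the pipeline described in the abstract concrete, the sketch below shows how one candidate solution (a "dolphin") could be decoded into ELM input weights and hidden biases, how the output weights are then obtained analytically, and how the candidate is scored by root mean squared error. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the sigmoid activation, the uniform [-1, 1] initialization range, and the use of the Moore-Penrose pseudoinverse for the output weights are assumptions, and the four update phases of the dolphin swarm algorithm, which would call this fitness function in an outer loop over a population of dolphins, are not reproduced here.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def elm_train(X, y, W, b):
    """Compute the ELM output weights analytically for given input
    weights W (n_hidden x n_features) and hidden biases b (n_hidden)."""
    H = sigmoid(X @ W.T + b)           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y       # output weights via Moore-Penrose pseudoinverse
    return beta


def elm_predict(X, W, b, beta):
    return sigmoid(X @ W.T + b) @ beta


def dolphin_fitness(dolphin, X, y, n_hidden):
    """Decode one dolphin (a flat vector holding the input weights and
    hidden biases), train the output weights, and return the RMSE used
    as the fitness value (lower is better)."""
    n_features = X.shape[1]
    W = dolphin[: n_hidden * n_features].reshape(n_hidden, n_features)
    b = dolphin[n_hidden * n_features:]
    beta = elm_train(X, y, W, b)
    pred = elm_predict(X, W, b, beta)
    return float(np.sqrt(np.mean((pred - y) ** 2)))


if __name__ == "__main__":
    # Toy regression problem; a real run would use the UCI benchmark datasets.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 5))
    y = np.sin(X.sum(axis=1))
    n_hidden = 20

    # One random dolphin, as the standard ELM would generate; the dolphin
    # swarm algorithm would instead evolve a population of such vectors
    # and keep the one with the lowest RMSE.
    dolphin = rng.uniform(-1.0, 1.0, size=n_hidden * X.shape[1] + n_hidden)
    print("RMSE of this candidate:", dolphin_fitness(dolphin, X, y, n_hidden))
```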



Acknowledgements

This work was partially supported by the National Natural Science Foundation of China under Project 61272261.

Author information

Corresponding author

Correspondence to Min Yao.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Informed Consent

Informed consent was not required as no human or animal was involved.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.


About this article


Cite this article

Wu, T., Yao, M. & Yang, J. Dolphin Swarm Extreme Learning Machine. Cogn Comput 9, 275–284 (2017). https://doi.org/10.1007/s12559-017-9451-y

