Abstract
As a novel learning algorithm for single-hidden-layer feedforward neural networks, the extreme learning machine (ELM) has attracted much research attention for its fast training speed and good generalization performance. Instead of iteratively tuning the parameters, the ELM randomly generates the input weights and hidden biases, which reduces training to a linear optimization problem. However, this random determination of the input weights and hidden biases may produce non-optimal parameters, which either degrade the final results or require more hidden nodes in the network. To overcome these drawbacks, we propose a new hybrid learning algorithm, the dolphin swarm algorithm extreme learning machine, which adopts the dolphin swarm algorithm to optimize the input weights and hidden biases efficiently. Each set of input weights and hidden biases is encoded into one vector, namely a dolphin. The dolphins are evaluated by root mean squared error and updated through the four pivotal phases of the dolphin swarm algorithm. Eventually, an optimal set of input weights and hidden biases is obtained. To evaluate the effectiveness of our method, we compare the proposed algorithm with the standard extreme learning machine and three state-of-the-art methods, namely the particle swarm optimization extreme learning machine, the evolutionary extreme learning machine, and the self-adaptive evolutionary extreme learning machine, on 13 benchmark datasets from the University of California Irvine Machine Learning Repository. The experimental results demonstrate that the proposed method achieves better generalization performance than all the compared algorithms.
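The scheme described above can be sketched in a few lines of Python with NumPy. This is a minimal illustration, not the authors' implementation: each candidate set of input weights and hidden biases is flattened into one "dolphin" vector, its fitness is the RMSE of the analytically solved ELM, and the swarm update is a simplified move-toward-best rule standing in for the dolphin swarm algorithm's four phases. The toy problem, population size, and update coefficients are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_rmse(X, T, w, b):
    """Given input weights w and biases b, solve the output weights
    analytically (the linear step of the ELM) and return the RMSE."""
    H = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid hidden-layer output
    beta = np.linalg.pinv(H) @ T             # Moore-Penrose least-squares solution
    return np.sqrt(np.mean((H @ beta - T) ** 2))

# Toy regression problem: approximate y = sin(x) on [0, pi].
X = np.linspace(0, np.pi, 80).reshape(-1, 1)
T = np.sin(X)

n_hidden, n_dolphins, n_iter = 10, 20, 30
dim = X.shape[1] * n_hidden + n_hidden       # weights + biases per dolphin

def decode(v):
    """Split one flat dolphin vector back into (weights, biases)."""
    w = v[: X.shape[1] * n_hidden].reshape(X.shape[1], n_hidden)
    return w, v[X.shape[1] * n_hidden:]

# Each dolphin encodes one candidate set of input weights and biases.
swarm = rng.uniform(-1, 1, size=(n_dolphins, dim))
fitness = np.array([elm_rmse(X, T, *decode(v)) for v in swarm])

for _ in range(n_iter):
    best = swarm[fitness.argmin()]
    for i in range(n_dolphins):
        # Simplified update: drift toward the best dolphin plus noise
        # (a stand-in for the DSA's four search/update phases).
        cand = (swarm[i] + rng.uniform(0, 1, dim) * (best - swarm[i])
                + 0.1 * rng.normal(size=dim))
        f = elm_rmse(X, T, *decode(cand))
        if f < fitness[i]:                   # greedy replacement
            swarm[i], fitness[i] = cand, f

w_opt, b_opt = decode(swarm[fitness.argmin()])
rmse = elm_rmse(X, T, w_opt, b_opt)
print(f"best RMSE: {rmse:.4f}")
```

Because the output weights are always solved in closed form, the swarm only searches the space of input weights and biases, which is what keeps the hybrid approach cheap relative to fully iterative training.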
Acknowledgements
This work was supported in part by Project 61272261 of the National Natural Science Foundation of China.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Informed Consent
Informed consent was not required as no human or animal was involved.
Ethical Approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Cite this article
Wu, T., Yao, M. & Yang, J. Dolphin Swarm Extreme Learning Machine. Cogn Comput 9, 275–284 (2017). https://doi.org/10.1007/s12559-017-9451-y