A Meta-Cognitive Learning Algorithm for an Extreme Learning Machine Classifier


Abstract

This paper presents an efficient, fast-learning classifier based on the Nelson and Narens model of human meta-cognition, called the Meta-cognitive Extreme Learning Machine (McELM). McELM has two components: a cognitive component and a meta-cognitive component. The cognitive component is a three-layered extreme learning machine (ELM) classifier whose hidden-layer neurons employ the q-Gaussian activation function, while the input and output neurons are linear. The meta-cognitive component implements a self-regulatory learning mechanism that decides what-to-learn, when-to-learn, and how-to-learn in a meta-cognitive framework. As the training samples are presented one by one, the meta-cognitive component receives monitoring signals from the cognitive component and chooses a suitable learning strategy for each sample: it either deletes the sample, uses it to add a new hidden neuron, uses it to update the output weights, or reserves it for future use. Therefore, unlike the conventional ELM, the architecture of McELM is not fixed a priori; instead, the network is built during the training process. When a neuron is added, its center is chosen from the sample and the width of its q-Gaussian function is chosen randomly. The output weights are estimated using a least-squares estimate based on the hinge-loss error function, which approximates posterior probabilities better than the mean-squared error and is hence preferred in developing the McELM classifier. When the network parameters are updated, the output weights are updated using a recursive least-squares estimate. The performance of McELM is evaluated on a set of benchmark classification problems from the UCI machine learning repository. The results highlight that meta-cognition in the ELM framework significantly enhances the decision-making ability of ELM.
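
The sketch below illustrates, in Python, how such a sample-by-sample meta-cognitive training loop can be organized. It is a minimal sketch under simplifying assumptions: the class name SketchMcELM, the q_gaussian helper, the threshold values, the random-width range, and the exact delete/grow/update/reserve criteria are illustrative choices rather than the formulation in the paper, and the recursive least-squares step here updates on the plain residual instead of the paper's hinge-loss-based estimate.

```python
import numpy as np


def q_gaussian(d2, q=1.5, width=1.0):
    """q-Gaussian activation on squared distances d2 (approaches a Gaussian as q -> 1)."""
    base = np.maximum(1.0 - (1.0 - q) * d2 / (width ** 2), 1e-12)
    return base ** (1.0 / (1.0 - q))


class SketchMcELM:
    """Illustrative McELM-style classifier; thresholds and criteria are assumed values."""

    def __init__(self, n_classes, q=1.5, delete_thr=0.1, update_thr=0.5, grow_thr=1.5):
        self.n_classes, self.q = n_classes, q
        self.delete_thr, self.update_thr, self.grow_thr = delete_thr, update_thr, grow_thr
        self.centers, self.widths = [], []   # hidden-neuron centers and q-Gaussian widths
        self.W = None                        # output weights, shape (K, n_classes)
        self.P = None                        # inverse-covariance matrix for the RLS update

    def _hidden(self, x):
        d2 = np.array([np.sum((x - c) ** 2) for c in self.centers])
        return q_gaussian(d2, self.q, np.array(self.widths))

    def _grow(self, x, t):
        self.centers.append(np.asarray(x, dtype=float))   # center taken from the sample itself
        self.widths.append(np.random.uniform(0.5, 2.0))   # width chosen randomly (assumed range)
        K = len(self.centers)
        self.P = np.eye(K) * 1e3                          # re-initialize RLS state for the new size
        if self.W is None:
            H = self._hidden(x).reshape(1, K)
            self.W = np.linalg.pinv(H) @ t.reshape(1, -1) # least-squares fit to the first sample
        else:
            self.W = np.vstack([self.W, np.zeros((1, self.n_classes))])

    def _rls_update(self, x, t):
        h = self._hidden(x).reshape(-1, 1)                # (K, 1) hidden-layer response
        gain = self.P @ h / (1.0 + h.T @ self.P @ h)      # RLS gain vector
        residual = t - (h.T @ self.W).ravel()
        self.W += gain @ residual.reshape(1, -1)          # recursive least-squares weight update
        self.P -= gain @ h.T @ self.P

    def partial_fit(self, x, label):
        """Process one sample and return the learning strategy chosen for it."""
        t = -np.ones(self.n_classes)
        t[label] = 1.0                                    # coded class targets in {-1, +1}
        if not self.centers:
            self._grow(x, t)
            return "grow"
        err = np.max(np.clip(1.0 - t * (self._hidden(x) @ self.W), 0.0, None))  # hinge-loss error
        if err < self.delete_thr:
            return "delete"                               # sample carries no new knowledge
        if err > self.grow_thr:
            self._grow(x, t)                              # add a hidden neuron for a novel sample
            return "grow"
        if err > self.update_thr:
            self._rls_update(x, t)                        # refine the output weights
            return "update"
        return "reserve"                                  # keep the sample for a later pass
```

Calling partial_fit on each training sample in turn returns the strategy the meta-cognitive component chose for that sample, mirroring the what-to-learn, when-to-learn, and how-to-learn decisions described in the abstract.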


References

  1. Platt JC. A resource-allocating network for function interpolation. Neural Comput. 1991;3(2):213–25.

  2. Yingwei L, Sundararajan N, Saratchandran P. A sequential learning scheme for function approximation using minimal radial basis function neural networks. Neural Comput. 1997;9(2):461–78.

  3. Yingwei L, Sundararajan N, Saratchandran P. Performance evaluation of a sequential minimal radial basis function (RBF) neural network learning algorithm. IEEE Trans Neural Netw. 1998;9(2):308–18.

  4. Huang GB, Saratchandran P, Sundararajan N. An efficient sequential learning algorithm for growing and pruning RBF (GAP-RBF) networks. IEEE Trans Syst Man Cybern Part B Cybern. 2004;34(6):2284–92.

  5. Suresh S, Sundararajan N, Saratchandran P. A sequential multi-category classifier using radial basis function networks. Neurocomputing. 2008;71(7–9):1345–58.

  6. Suresh S, Dong K, Kim HJ. A sequential learning algorithm for self-adaptive resource allocation network classifier. Neurocomputing. 2010;73(16–18):3012–9.

  7. Babu GS, Suresh S. Meta-cognitive neural network for classification problems in a sequential learning framework. Neurocomputing. 2012;81:86–96.

  8. Huang GB, Siew CK. Extreme learning machine with randomly assigned RBF kernels. Int J Inf Technol. 2005;11(1):16–24.

  9. Huang GB, Zhou H, Ding X, Zhang R. Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B Cybern. 2012;42(2):513–29.

  10. Liang N-Y, Huang GB, Saratchandran P, Sundararajan N. A fast and accurate on-line sequential learning algorithm for feedforward networks. IEEE Trans Neural Netw. 2006;17(6):1411–23.

  11. Rong HJ, Huang GB, Sundararajan N, Saratchandran P. Online sequential fuzzy extreme learning machine for function approximation and classification problems. IEEE Trans Syst Man Cybern Part B Cybern. 2009;39(4):1067–72.

  12. Zhao J, Wang Z, Park DS. Online sequential extreme learning machine with forgetting mechanism. Neurocomputing. 2012;87:79–89.

  13. Huang GB, Chen L, Siew CK. Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw. 2006;17(4):879–92.

  14. Huang GB, Wang DH, Lan Y. Extreme learning machines: a survey. Int J Mach Learn Cybern. 2011;2(2):107–22.

  15. Sun Y, Yuan Y, Wang G. An OS-ELM based distributed ensemble classification framework in P2P networks. Neurocomputing. 2011;74(16):2438–43.

  16. Cheng C, Tay WP, Huang GB. Extreme learning machines for intrusion detection. In: Proceedings of the International Joint Conference on Neural Networks (IJCNN 2012); 2012. Art. no. 6252449.

  17. Suresh S, Babu RV, Kim HJ. No-reference image quality assessment using modified extreme learning machine classifier. Appl Soft Comput. 2009;9(2):541–52.

  18. Decherchi S, Gastaldo P, Zunino R, Cambria E, Redi J. Circular-ELM for the reduced-reference assessment of perceived image quality. Neurocomputing. 2013;102:78–89.

  19. Babu RV, Suresh S, Makur A. Online adaptive radial basis function networks for robust object tracking. Comput Vis Image Underst. 2010;114(3):297–10.

  20. Wang G, Zhao Y, Wang DD. A protein secondary structure prediction framework based on the extreme learning machine. Neurocomputing. 2008;72(1–3):262–8.

  21. Suresh S, Saraswathi S, Sundararajan N. Performance enhancement of extreme learning machine for multi-category sparse data classification problems. Eng Appl Artif Intell. 2010;23(7):149–57.

  22. Saraswathi S, Sundaram S, Sundararajan N, Zimmermann M, Nilsen-Hamilton M. ICGA-PSO-ELM approach for accurate multiclass cancer classification resulting in reduced gene sets in which genes encoding secreted proteins are highly represented. IEEE/ACM Trans Comput Biol Bioinform. 2011;8(2):452–63.

  23. Isaacson R, Fujita F. Metacognitive knowledge monitoring and self-regulated learning: academic success and reflections on learning. J Scholarsh Teach Learn. 2006;6(1):39–55.

  24. Cambria E, Olsher D, Kwok K. Sentic activation: a two-level affective common sense reasoning framework. In: Proceedings of the Association for the Advancement of Artificial Intelligence Conference (AAAI), Toronto; 2012. p. 186–92.

  25. Savitha R, Suresh S, Sundararajan N. Meta-cognitive learning in fully complex-valued radial basis function network. Neural Comput. 2012;24(5):1297–28.

  26. Savitha R, Suresh S, Sundararajan N. A meta-cognitive learning algorithm for a fully complex-valued relaxation network. Neural Netw. 2012;32:209–18.

  27. Babu GS, Suresh S. Meta-cognitive RBF network and its projection based learning algorithm for classification problems. Appl Soft Comput. 2013;13(1):654–66.

  28. Babu GS, Suresh S. Sequential projection-based metacognitive learning in a radial basis function network for classification problems. IEEE Trans Neural Netw. 2012;24(2):194–206.

  29. Nelson TO, Narens L. Metamemory: a theoretical framework and new findings. In: Nelson TO, editor. Metacognition: core readings. Boston: Allyn and Bacon; 1980. p. 9–24.

  30. Fernández-Navarro F, Hervás-Martínez C, Gutiérrez PA, Peña-Barragán JM, López-Granados F. Parameter estimation of q-Gaussian radial basis function neural networks with a hybrid algorithm for binary classification. Neurocomputing. 2012;75:123–34.

  31. Zhang T. Statistical behavior and consistency of classification methods based on convex risk minimization. Ann Stat. 2003;32(1):56–85.

  32. Suresh S, Sundararajan N, Saratchandran P. Risk-sensitive loss functions for sparse multi-category classification problems. Inf Sci. 2008;178(12):2621–38.

  33. Blake C, Merz C. UCI repository of machine learning databases. Irvine: Department of Information and Computer Sciences, University of California; 1998. http://archive.ics.uci.edu/ml/.

  34. Cox MT. Metacognition in computation: a selected research review. Artif Intell. 2005;169(2):104–41.

  35. Suresh S, Savitha R, Sundararajan N. A sequential learning algorithm for complex-valued resource allocation network-CSRAN. IEEE Trans Neural Netw. 2011;22(7):1061–72.

  36. Savitha R, Suresh S, Sundararajan N. A fully complex-valued radial basis function network and its learning algorithm. Int J Neural Syst. 2009;19(4):253–67.

  37. Goh SL, Mandic DP. An augmented extended Kalman filter algorithm for complex-valued recurrent neural networks. Neural Comput. 2007;19(4):1039–55.

  38. Savitha R, Suresh S, Sundararajan N. A fast learning fully complex-valued relaxation network (FCRN). In: Proceedings of the international joint conference on neural networks 2011; p. 1372–7.

  39. Subramanian K, Suresh S. A sequential learning algorithm for meta-cognitive neuro-fuzzy inference system for classification problems. In: Proceedings of the international joint conference on neural networks 2011; p. 2507–12.

  40. Subramanian K, Suresh S. Human action recognition using meta-cognitive neuro-fuzzy inference system. Int J Neural Syst. 2012;22(6):1250028.

  41. Subramanian K, Suresh S. A meta-cognitive sequential learning algorithm for neuro-fuzzy inference system. Appl Soft Comput. 2012;12(11):3603–14.

  42. Shkurko K, Qi X. A radial basis function and semantic learning space based composite learning approach to image retrieval. Proc ICASSP IEEE Int Conf Acoust Speech Signal Process. 2007;1:945–8.

  43. Zhang J, Li H. A reconstruction approach to CT with Cauchy RBFs network. In: Advances in Neural Networks – ISNN 2004, Lecture Notes in Computer Science, vol 3174; 2004. p. 234–6.

  44. Saranli A, Baykal B. Complexity reduction in radial basis function (RBF) networks by using radial B-spline functions. Neurocomputing. 1998;18(1–3):183–94.

  45. Udwadia FE, Phohomsiri P. Generalized LM-inverse of a matrix augmented by a column vector. Appl Math Comput. 2007;190:999–06.

  46. Chang C-C, Lin C-J. LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol. 2011;2(3):27:1–27:27. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.

Author information

Corresponding author

Correspondence to H. J. Kim.

About this article

Cite this article

Savitha, R., Suresh, S. & Kim, H.J. A Meta-Cognitive Learning Algorithm for an Extreme Learning Machine Classifier. Cogn Comput 6, 253–263 (2014). https://doi.org/10.1007/s12559-013-9223-2
