
Conditional Random Mapping for Effective ELM Feature Representation


Abstract

Extreme learning machine (ELM) has been studied extensively due to its fast training and good generalization. Unfortunately, existing ELM-based feature representation methods remain uncompetitive with state-of-the-art deep neural networks (DNNs) on complex visual recognition tasks. This weakness stems from two critical defects: (1) random feature mapping (RFM) drawn from an ad hoc probability distribution cannot reliably project diverse input data into discriminative feature spaces; (2) in ELM-based hierarchical architectures, features from the previous layer are scattered by the RFM of the current layer, which makes the abstraction of higher-level features ineffective. To address these issues, we exploit label information to optimize the random mapping in ELM, using an efficient label alignment metric to learn a conditional random feature mapping (CRFM) in a supervised manner. Moreover, we propose a new CRFM-based single-layer ELM (CELM) and extend it to a supervised multi-layer learning architecture (ML-CELM). Extensive experiments on widely used datasets demonstrate that our approach is more effective than the original ELM-based and existing DNN feature representation methods, while retaining rapid training and testing speed. The proposed CELM and ML-CELM achieve discriminative and robust feature representations, and show superior generalization and speed in various simulations.
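The page does not reproduce the CRFM formulation itself, so the following is only a rough sketch of the idea described in the abstract: draw several candidate random mappings, score each by how well its hidden features align with the training labels, keep the best candidate, and solve the standard ELM readout in closed form. The kernel-target alignment score, the candidate-screening loop, and all names (`label_alignment`, `train_celm_sketch`) are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def label_alignment(H, Y):
    """Kernel-target alignment between hidden features H (n x h) and
    one-hot labels Y (n x c): <K, T>_F / (||K||_F ||T||_F), with
    K = H H^T and T = Y Y^T. Higher values mean the feature space
    separates the classes better. (Assumed metric, for illustration.)"""
    K = H @ H.T
    T = Y @ Y.T
    return np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T) + 1e-12)

def train_celm_sketch(X, Y, n_hidden=200, n_candidates=10, C=1.0, seed=None):
    """Hypothetical CELM-style training: sample several random mappings,
    keep the one whose hidden features best align with the labels, then
    solve the ELM output weights in closed form (ridge regression)."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_candidates):
        W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
        b = rng.standard_normal(n_hidden)                # random biases
        H = np.tanh(X @ W + b)                           # hidden-layer features
        score = label_alignment(H, Y)
        if best is None or score > best[0]:
            best = (score, W, b, H)
    _, W, b, H = best
    # Closed-form ELM readout: beta = (H^T H + I/C)^{-1} H^T Y
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ Y)
    return W, b, beta

def predict(X, W, b, beta):
    """Class predictions from the selected mapping and learned readout."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```

For the multi-layer variant (ML-CELM), the abstract suggests stacking such supervised mappings layer by layer; in this sketch that would amount to using the selected hidden features `np.tanh(X @ W + b)` of one trained layer as the input to the next.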




Funding

This work was supported by the National Natural Science Foundation of China (NSFC) under Grant 91438203.

Author information


Corresponding author

Correspondence to Chenwei Deng.

Ethics declarations

Conflict of Interest

All authors declare that they have no conflict of interest.

Informed Consent

Informed consent was not required as no humans or animals were involved.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.


About this article


Cite this article

Li, C., Deng, C., Zhou, S. et al. Conditional Random Mapping for Effective ELM Feature Representation. Cogn Comput 10, 827–847 (2018). https://doi.org/10.1007/s12559-018-9557-x

