Gauss–Seidel Extreme Learning Machines

  • Original Research
  • Published:
SN Computer Science

Abstract

Extreme learning machines (ELM) were created to simplify the training phase of single-layer feedforward neural networks: the input weights are set randomly, and the only parameter to tune is the number of neurons in the hidden layer. These networks are also known for their one-shot training based on the Moore–Penrose pseudo-inverse. In this work, we propose the Gauss–Seidel extreme learning machine (GS-ELM), an ELM that uses the Gauss–Seidel iterative method to solve the linear equation system of the training phase. We performed tests on databases with different characteristics and compared the discrimination capability and memory consumption of GS-ELM with those of the canonical ELM and the online sequential ELM (OS-ELM). GS-ELM showed similar discrimination capability while consuming significantly less memory, making it suitable for low-memory systems and embedded solutions.
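
The idea can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration rather than the paper's reference implementation: it assumes the common ridge-regularised ELM formulation, in which the output weights β solve (H^T H + λI)β = H^T T for the hidden-layer activations H and the targets T, and it replaces the explicit Moore–Penrose pseudo-inverse with Gauss–Seidel sweeps over that system. The function and parameter names (gauss_seidel, elm_train_gs, n_hidden, reg) are illustrative assumptions, not identifiers from the paper.

```python
# Sketch only: assumes a ridge-regularised ELM, (H^T H + reg*I) beta = H^T T,
# solved by Gauss-Seidel sweeps instead of an explicit Moore-Penrose pseudo-inverse.
import numpy as np

def gauss_seidel(A, B, n_sweeps=200, tol=1e-8):
    """Solve A X = B by Gauss-Seidel sweeps (A square; here symmetric positive definite)."""
    X = np.zeros_like(B, dtype=float)
    for _ in range(n_sweeps):
        X_old = X.copy()
        for i in range(A.shape[0]):
            # Each row update reuses the already-refreshed rows of X (Gauss-Seidel, not Jacobi).
            X[i] = (B[i] - A[i, :] @ X + A[i, i] * X[i]) / A[i, i]
        if np.max(np.abs(X - X_old)) < tol:
            break
    return X

def elm_train_gs(X, T, n_hidden=50, reg=1e-3, seed=0):
    """Hypothetical GS-ELM-style training: random hidden layer + Gauss-Seidel solve."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden-layer activations
    A = H.T @ H + reg * np.eye(n_hidden)             # ridge-regularised normal equations
    beta = gauss_seidel(A, H.T @ T)                  # output weights, no pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because H^T H + λI is symmetric positive definite, the Gauss–Seidel iteration converges on this system; working on the small n_hidden × n_hidden matrix instead of forming an explicit pseudo-inverse of H is one plausible source of the memory savings reported above.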

Acknowledgements

We would like to express our great appreciation to the Brazilian research funding agencies CNPq and CAPES for their partial financial support.

Funding

Funding received from Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (BR) (scholarships PPGEC-UPE-2017); CNPq-Brazil (Grant 314896/2018-0).

Author information

Corresponding author

Correspondence to Wellington P. dos Santos.

Ethics declarations

Conflict of interest

This paper’s authors certify that they have no affiliations with, or involvement in, any organization or entity with a financial interest in the subject matter or materials discussed in this manuscript, and that they themselves have no such financial interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix: Tables of Memory Consumption

This section shows, in more detail, the memory consumption results for every configuration of GS-ELM, ELM and OS-ELM on each database (Tables 26, 27, 28, 29, 30). These results are explained and further explored in the Results section.

Table 26 Memory consumption for all configurations on Iris database
Table 27 Memory consumption for all configurations on Glass database
Table 28 Memory consumption (MB) for all configurations on Coil–20–Unproc database
Table 29 Memory consumption for all configurations on Wine database
Table 30 Memory consumption for all configurations on Bupa database

About this article

Cite this article

de Freitas, R.C., Ferreira, J., de Lima, S.M.L. et al. Gauss–Seidel Extreme Learning Machines. SN COMPUT. SCI. 1, 220 (2020). https://doi.org/10.1007/s42979-020-00232-w

  • DOI: https://doi.org/10.1007/s42979-020-00232-w
