Energy-Time Profiling for Machine Learning Methods to EEG Classification

  • Conference paper
  • First Online:
Bioengineering and Biomedical Signal and Image Processing (BIOMESIP 2021)

Abstract

Electroencephalography studies a type of signal, the electroencephalogram (EEG), which records the electrical activity of different parts of the brain. EEGs contain a massive number of features that could be used to build an intelligent recognition system. However, the large number of available features hinders correct classification of the signals, since most of them carry no relevant information. Feature Selection (FS) techniques reduce the dimensionality of the data by extracting a small but powerful feature subset. The classification problem can then be addressed by applying standard classification methods to the reduced feature set. In this context, this work compares five supervised classifiers on an EEG classification problem arising from Motor Imagery-based BCI tasks. The predictive models are evaluated in terms of classification rate and performance (execution time and energy consumption) to determine which alternative offers the best trade-off among all the objectives.
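
To make the workflow described above concrete, the following is a minimal sketch, not the paper's actual pipeline: it assumes synthetic data in place of the Essex BCI recordings, uses a univariate mutual-information filter as a stand-in for the FS stage, and times a handful of off-the-shelf scikit-learn classifiers of the kinds discussed here (KNN, SVM, naive Bayes, decision tree, neural network). Energy measurement, which in practice requires platform counters such as Intel RAPL, is omitted.

```python
# Hedged sketch of "feature selection, then timed classifier comparison".
# Synthetic data and classifier settings are illustrative assumptions,
# not the configuration used in the paper.
import time

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in for an EEG feature matrix: many features, few of them informative.
X, y = make_classification(n_samples=400, n_features=1000, n_informative=30,
                           n_classes=2, random_state=0)

candidates = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
    "NaiveBayes": GaussianNB(),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
}

for name, clf in candidates.items():
    # The FS step (here a univariate mutual-information filter keeping 50
    # features) is fitted inside the pipeline, so each CV fold selects
    # features on its training data only.
    pipe = make_pipeline(StandardScaler(),
                         SelectKBest(mutual_info_classif, k=50),
                         clf)
    start = time.perf_counter()
    scores = cross_val_score(pipe, X, y, cv=5)
    elapsed = time.perf_counter() - start
    # Energy would additionally require hardware counters (e.g. Intel RAPL);
    # only wall-clock time is reported in this sketch.
    print(f"{name:12s}  accuracy={scores.mean():.3f}  time={elapsed:.1f}s")
```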


Acknowledgments

This research has been funded by the Spanish Ministry of Science, Innovation, and Universities under grant PGC2018-098813-B-C31 and ERDF fund. We would like to thank the BCI laboratory of the University of Essex, especially Dr. John Q. Gan, for allowing us to use their datasets.

Author information

Corresponding author

Correspondence to Juan Carlos Gómez-López.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Gómez-López, J.C., et al. (2021). Energy-Time Profiling for Machine Learning Methods to EEG Classification. In: Rojas, I., Castillo-Secilla, D., Herrera, L.J., Pomares, H. (eds) Bioengineering and Biomedical Signal and Image Processing. BIOMESIP 2021. Lecture Notes in Computer Science, vol 12940. Springer, Cham. https://doi.org/10.1007/978-3-030-88163-4_27

  • DOI: https://doi.org/10.1007/978-3-030-88163-4_27

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88162-7

  • Online ISBN: 978-3-030-88163-4

  • eBook Packages: Computer Science, Computer Science (R0)
