An EEG-Based BCI System to Facial Action Recognition

Wireless Personal Communications

Abstract

Brain–computer interfaces (BCIs) have opened a new era in neuroscience and have improved the quality of life of severely disabled patients, allowing them to express their will through cognitive, expressive, and affective brain activities. In this paper, a wireless electroencephalogram (EEG)-based BCI system was developed to acquire EEG signals with an Emotiv EPOC headset and recognize facial actions. The extracted EEG feature vectors are reduced by the wavelet transform, and the reduced signals are then classified into six clusters by a support vector machine (SVM) with a Gaussian kernel function. Better recognition rates were obtained with the first-order wavelet transform than with the third-order wavelet transform. To control an electronic system smoothly in real time, the amount of sampled data has to be reduced; when time consumption is taken into account, the first-order wavelet transform with 32 samples is a suitable choice. The experimental results showed a promising recognition rate for facial actions with the proposed real-time BCI system.
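
As a concrete illustration of the pipeline outlined above, the following Python sketch combines a first-level wavelet decomposition for feature reduction with a Gaussian-kernel (RBF) support vector machine for six-class classification. It is not the authors' implementation: the PyWavelets and scikit-learn libraries, the db4 mother wavelet, the 14-channel 32-sample epoch shape, and the randomly generated placeholder data are all assumptions made for the example.

```python
# Minimal sketch of the wavelet-plus-SVM pipeline described in the abstract.
# Not the authors' code: wavelet choice (db4), epoch shape, and data are assumptions.
import numpy as np
import pywt
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def wavelet_features(epoch, wavelet="db4", level=1):
    """Reduce one EEG epoch (channels x samples) to the level-1 approximation
    coefficients of each channel, concatenated into a single feature vector."""
    return np.concatenate(
        [pywt.wavedec(channel, wavelet, level=level)[0] for channel in epoch]
    )


# Hypothetical data set: 600 epochs of 14 channels (Emotiv EPOC) x 32 samples,
# each labelled with one of six facial actions (0-5).
rng = np.random.default_rng(seed=0)
epochs = rng.standard_normal((600, 14, 32))
labels = rng.integers(0, 6, size=600)

# Feature reduction, then a Gaussian-kernel SVM for six-cluster classification.
X = np.array([wavelet_features(e) for e in epochs])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0
)
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With real Emotiv EPOC recordings in place of the placeholder arrays, the same structure reproduces the wavelet-then-SVM flow the abstract describes.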


Acknowledgments

This research was sponsored by the Ministry of Science and Technology of Taiwan under Grant NSC103-2221-E-167-027.

Author information

Corresponding author

Correspondence to Jzau-Sheng Lin.

About this article

Cite this article

Lin, JS., Jiang, ZY. An EEG-Based BCI System to Facial Action Recognition. Wireless Pers Commun 94, 1579–1593 (2017). https://doi.org/10.1007/s11277-016-3700-3
