
Hybrid sensing and encoding using pad phone for home robot control

Published in Multimedia Tools and Applications

Abstract

For patients with limb disorders to control the movement of a home robot, human intention sensing and encoding are two key tasks. This paper presents a new augmented-reality brain-computer interface (ARBCI) based on the steady-state visual evoked potential (SSVEP), human intention recognition algorithms using the SSVEP and the electrooculogram (EOG) respectively, and an encoding design for the recognized intentions. First, the new ARBCI is developed; it comprises an SSVEP collector and an augmented-reality stimulator that overlays the symbols of the robot operations onto the specific environment, and the robot control instructions are encoded accordingly. Second, a sliding-window superposition-average algorithm (SWSA) is proposed for SSVEP-based intention recognition: the stimulation frequency feature evoked by the augmented-reality stimulator is extracted with the SWSA to control the power supply and the robot speed. Third, an EOG amplitude threshold for intentional blinking is determined experimentally, and a fusion recognition (FR) algorithm combining amplitude and sampling time is developed from the EOG to control the robot's direction. Experiments show that the ARBCI improves eye comfort over the original BCI stimulator, and that the SWSA senses an SSVEP intention 4 s faster than the traditional superposition-average method while keeping the same recognition accuracy. Moreover, with the ARBCI, the SWSA, and the FR algorithm, intention sensing accuracy reaches 100% when the sensing time is adequate, and an EOG intention is sensed in about 0.5 s with the FR algorithm.
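As a rough illustration of the two recognition ideas the abstract describes, the following is a minimal Python sketch, not the paper's implementation: the function names (swsa_detect, detect_blink), the spectral scoring tolerance, the harmonic count, and the blink duration bounds are all illustrative assumptions. It shows (i) sliding-window superposition averaging followed by spectral scoring to pick the attended stimulation frequency, and (ii) an amplitude-plus-duration test for intentional blinks.

```python
import numpy as np

def swsa_detect(eeg, fs, win_s, step_s, stim_freqs, n_harmonics=2):
    """Sliding-window superposition-average SSVEP detection (sketch).

    Overlapping epochs are superposition-averaged so the periodic SSVEP
    component reinforces while uncorrelated background EEG cancels; the
    averaged epoch's spectrum is then scored at each candidate
    stimulation frequency and its harmonics.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    epochs = [eeg[i:i + win] for i in range(0, len(eeg) - win + 1, step)]
    avg = np.mean(epochs, axis=0)

    spectrum = np.abs(np.fft.rfft(avg))
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    tol = fs / win  # one FFT bin of slack so each target hits a bin

    def score(f0):
        # spectral magnitude summed near f0 and its harmonics
        return sum(spectrum[np.abs(freqs - h * f0) < tol].sum()
                   for h in range(1, n_harmonics + 1))

    return max(stim_freqs, key=score)  # best-matching stimulation frequency

def detect_blink(eog, fs, amp_thresh, min_dur_s=0.1, max_dur_s=0.5):
    """Amplitude/duration fusion for intentional blinks (sketch).

    A supra-threshold EOG excursion counts as an intentional blink only
    if it lasts between min_dur_s and max_dur_s, which rejects brief
    noise spikes and slow baseline drift.
    """
    above = np.abs(eog) > amp_thresh
    run = 0
    for sample in np.append(above, False):  # sentinel closes a final run
        if sample:
            run += 1
        elif min_dur_s * fs <= run <= max_dur_s * fs:
            return True
        else:
            run = 0
    return False
```

Superposition averaging over a sliding window lets every new sample contribute to the running average immediately instead of waiting for a fresh batch of trials, which is consistent with the abstract's reported 4 s saving over the traditional superposition-average method.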



Acknowledgments

This research was sponsored by the Natural Science Foundation of China (51405381), Key Scientific and Technological Project of Shaanxi Province (2016GY-040), and the Science Foundation of Xi’an University of Science and Technology (104-6319900001).

Author information


Corresponding author

Correspondence to Wen-Yuan Chen.


Cite this article

Wang, M., Qu, W. & Chen, WY. Hybrid sensing and encoding using pad phone for home robot control. Multimed Tools Appl 77, 10773–10786 (2018). https://doi.org/10.1007/s11042-017-4871-y
