
The Analysis of Concurrent-Task Operation Ability: Peripheral-Visual-Guided Grasp Performance Under the Gaze

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13014)

Included in the following conference series: Intelligent Robotics and Applications (ICIRA 2021)

Abstract

Gazing at a target is a necessary part of human–machine interaction, providing basic scene information in activities such as watching videos, handling documents, and manufacturing and assembling workpieces. In these situations, accomplishing a concurrent task (such as getting a drink, taking a file, or picking up a tool) requires a peripheral-visual-guided grasp. Compared with a natural grasp without visual restrictions, peripheral-visual guidance limits grasp performance. However, few studies have quantitatively investigated human concurrent-task operation ability. In this paper, we built a general concurrent-task scene: a peripheral-visual-guided grasp performed while gazing at a target. Ten volunteer participants were required to keep their eyes on a screen target and accomplish the concurrent grasp task under three restrictions: (1) both head and trunk kept still, (2) trunk kept still while the head could turn, and (3) both trunk and head free to turn. The maximum range of object placement over which an accurate grasp could still be accomplished is defined as the workspace for each restriction. The rotation angles of the head and trunk, together with the position of the grasped object, were collected using an inertial measurement unit (IMU) and a VICON motion-capture system. Three precision-grasp workspaces were obtained, spanning 44.631° ± 1.348°, 67.315° ± 2.075°, and 80.835° ± 4.360°, corresponding to the three restrictions. The workspace division results were further verified by the peak rotation angles of the head and trunk. These results can provide a design guideline for assistive devices that enhance human concurrent-task operation ability.
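The reported ranges are mean ± standard deviation across the ten participants. As a minimal sketch of that aggregation, assuming each participant's workspace under a given restriction is summarized by a single peak placement angle, the following Python snippet computes the statistic; all variable names and numeric values here are hypothetical, not the paper's data.

    import numpy as np

    # Hypothetical peak placement angles (degrees) for ten participants
    # under restriction (1), head and trunk kept still: illustrative only.
    peak_angles = np.array([44.2, 45.9, 43.1, 44.8, 46.3,
                            43.7, 45.1, 44.0, 46.0, 43.2])

    def workspace_stats(angles):
        """Mean and sample standard deviation of the peak grasp angles,
        i.e. the angular extent of the precision-grasp workspace."""
        return float(np.mean(angles)), float(np.std(angles, ddof=1))

    mean, std = workspace_stats(peak_angles)
    print(f"workspace: {mean:.3f}° ± {std:.3f}°")  # workspace: 44.630° ± 1.172°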



Acknowledgement

This work was supported in part by the National Natural Science Foundation of China (51905375), the China Postdoctoral Science Foundation funded project (2019M651033), the Foundation of the State Key Laboratory of Robotics and System (HIT) (SKLRS-2019-KF-06), and the Peiyang Elite Scholar Program of Tianjin University (2020XRG-0023).

Author information

Corresponding author

Correspondence to Kuo Zhang.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, Y., Zhang, W., Zeng, B., Zhang, K., Cheng, Q., Ming, D. (2021). The Analysis of Concurrent-Task Operation Ability: Peripheral-Visual-Guided Grasp Performance Under the Gaze. In: Liu, X.-J., Nie, Z., Yu, J., Xie, F., Song, R. (eds.) Intelligent Robotics and Applications. ICIRA 2021. Lecture Notes in Computer Science, vol. 13014. Springer, Cham. https://doi.org/10.1007/978-3-030-89098-8_47

  • DOI: https://doi.org/10.1007/978-3-030-89098-8_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-89097-1

  • Online ISBN: 978-3-030-89098-8

  • eBook Packages: Computer Science (R0)
