Abstract
Gazing at a target is a fundamental part of human-machine interaction; it supplies the basic scene information in activities such as watching videos, handling documents, and manufacturing or assembling workpieces. In these situations, accomplishing a concurrent task (such as getting a drink, taking a file, or picking up a tool) requires a grasp guided only by peripheral vision. Compared with a natural grasp free of visual restrictions, peripheral-visual guidance limits grasp performance, yet few studies have quantitatively investigated human concurrent-task operation ability. In this paper, we built a general concurrent-task scene: a peripheral-visual-guided grasp performed while gazing at a target. Ten volunteer participants were required to keep their eyes on a screen target and accomplish the concurrent grasp task under three restrictions: (1) both head and trunk kept still; (2) trunk kept still while the head may turn; (3) both head and trunk may turn. The maximum range of object placement over which an accurate grasp can be accomplished is defined as the workspace for each restriction. The rotation angles of the head and trunk, together with the grasped object positions, were collected using an IMU and a VICON motion-capture system. Three precision-grasp workspaces were obtained, with ranges of 44.631 \(\pm \) 1.348º, 67.315 \(\pm \) 2.075º, and 80.835 \(\pm \) 4.360º, corresponding to the three restrictions. The workspace division results were further verified by the peak rotation angles of the head and trunk. These results can provide a design guideline for assistive devices that enhance human concurrent-task operation ability.
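The abstract defines each workspace as the angular extent of object placements over which an accurate grasp succeeds, reported as mean \(\pm \) standard deviation across participants. A minimal sketch of how such an extent might be computed from motion-capture positions is shown below; the function name, the coordinate convention (object positions in a horizontal plane centred on the participant), and the sample numbers are illustrative assumptions, not taken from the paper.

```python
import math
from statistics import mean, stdev

def workspace_angle(positions):
    """Angular extent (degrees) spanned by graspable object positions.

    positions: list of (x, y) object coordinates (metres) in a
    horizontal plane whose origin is the participant (hypothetical
    convention; x forward, y leftward).
    """
    azimuths = [math.degrees(math.atan2(y, x)) for x, y in positions]
    return max(azimuths) - min(azimuths)

# Hypothetical per-participant extents (degrees) for one restriction,
# aggregated the way the abstract reports them: mean +/- stdev.
extents = [44.2, 45.1, 43.9, 46.0, 44.5]
print(f"{mean(extents):.3f} +/- {stdev(extents):.3f} deg")
```

In practice the leftmost and rightmost successful grasp positions per participant would come from the VICON marker data, and the same aggregation would be repeated for each of the three restriction conditions.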
Acknowledgement
This work was supported in part by the National Natural Science Foundation of China (51905375), the China Post-doctoral Science Foundation Funded Project (2019M651033), Foundation of State Key Laboratory of Robotics and System (HIT) (SKLRS-2019-KF-06), and Peiyang Elite Scholar Program of Tianjin University (2020XRG-0023).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Liu, Y., Zhang, W., Zeng, B., Zhang, K., Cheng, Q., Ming, D. (2021). The Analysis of Concurrent-Task Operation Ability: Peripheral-Visual-Guided Grasp Performance Under the Gaze. In: Liu, XJ., Nie, Z., Yu, J., Xie, F., Song, R. (eds) Intelligent Robotics and Applications. ICIRA 2021. Lecture Notes in Computer Science(), vol 13014. Springer, Cham. https://doi.org/10.1007/978-3-030-89098-8_47
DOI: https://doi.org/10.1007/978-3-030-89098-8_47
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-89097-1
Online ISBN: 978-3-030-89098-8