Intelligent grasping with natural human-robot interaction
ISSN: 0143-991X
Article publication date: 5 December 2017
Issue publication date: 2 January 2018
Abstract
Purpose
The aim of this paper is to propose a grasping method based on intelligent perception for implementing grasp tasks under human guidance.
Design/methodology/approach
First, the authors leverage Kinect to collect environment information, including both image and voice. The target object is located and segmented through gesture recognition and speech analysis, and is finally grasped through path teaching. To obtain the posture of the human gesture accurately, the authors use a Kalman filtering (KF) algorithm to calibrate the posture, use a Gaussian mixture model (GMM) for human motion modeling, and then use Gaussian mixture regression (GMR) to predict the human motion posture.
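As an illustration of the GMM/GMR step, the following is a minimal sketch of fitting a joint GMM over (time, pose) demonstration samples and regressing the expected pose at a query time. The library choices (scikit-learn, SciPy) and all function names are assumptions for illustration, not the authors' implementation; the KF calibration stage is omitted here.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def fit_motion_model(times, poses, n_components=5):
    """Fit a joint GMM over [t, pose] demonstration samples."""
    data = np.hstack([times.reshape(-1, 1), poses])
    return GaussianMixture(n_components=n_components,
                           covariance_type="full").fit(data)

def gmr_predict(gmm, t):
    """Gaussian mixture regression: E[pose | time = t]."""
    d_in = 1  # time is the one-dimensional input
    weights, cond_means = [], []
    for k in range(gmm.n_components):
        mu, cov = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_y = mu[:d_in], mu[d_in:]
        cov_xx = cov[:d_in, :d_in]
        cov_yx = cov[d_in:, :d_in]
        # Responsibility of component k for this input value
        weights.append(gmm.weights_[k] *
                       multivariate_normal.pdf(t, mu_x, cov_xx))
        # Conditional mean of the pose given the time
        cond_means.append(mu_y + cov_yx @
                          np.linalg.solve(cov_xx, np.atleast_1d(t) - mu_x))
    weights = np.array(weights)
    weights /= weights.sum()
    return np.sum(weights[:, None] * np.array(cond_means), axis=0)
```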
Findings
Much of the point-cloud information is useless, so the authors combine the human's gesture to remove irrelevant objects from the environment as far as possible, which reduces the computation required to segment and recognize objects. To reduce the computation further, the authors use a down-sampling algorithm based on the voxel grid.
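A minimal sketch of such voxel-grid down-sampling is shown below. Open3D is an illustrative library choice rather than the paper's own pipeline, and the file names and voxel size are hypothetical.

```python
import open3d as o3d

# Load the raw Kinect point cloud (file name is hypothetical)
pcd = o3d.io.read_point_cloud("scene.pcd")

# Keep roughly one point per 1 cm voxel to cut the data volume
down = pcd.voxel_down_sample(voxel_size=0.01)

print(f"{len(pcd.points)} -> {len(down.points)} points")
o3d.io.write_point_cloud("scene_down.pcd", down)
```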
Originality/value
The authors used the down-sampling algorithm, the kd-tree algorithm and the viewpoint feature histogram (VFH) algorithm to remove the influence of unrelated objects and to obtain a better grasping state.
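The kd-tree step can be illustrated as nearest-neighbor matching over VFH descriptors. The sketch below assumes the 308-bin VFH signatures have already been extracted from the segmented clusters (e.g. with PCL's VFHEstimation); all names and the random stand-in data are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def build_vfh_index(model_descriptors):
    """model_descriptors: (n_models, 308) array of VFH signatures."""
    return cKDTree(model_descriptors)

def recognize(index, query_vfh, model_labels):
    """Return the label of the stored model closest to the query cluster."""
    dist, idx = index.query(query_vfh, k=1)
    return model_labels[idx], dist

# Hypothetical usage with random stand-in descriptors
models = np.random.rand(10, 308)
labels = [f"object_{i}" for i in range(10)]
index = build_vfh_index(models)
label, dist = recognize(index, np.random.rand(308), labels)
print(label, dist)
```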
Acknowledgements
The project was funded by the "Guangdong Natural Science Funds for Distinguished Young Scholar (2017A030306015)", "Pearl River S&T Nova Program of Guangzhou (201710010059)", "Guangdong Special Projects (2016TQ03X824)", "The Fundamental Research Funds for the Central Universities (No. 2017JQ009)", "National Natural Science Foundation of China (No. 61403145)" and "Guangzhou Science and Technology Project (201604046029)".
Citation
Zhou, Y., Chen, M., Du, G., Zhang, P. and Liu, X. (2018), "Intelligent grasping with natural human-robot interaction", Industrial Robot, Vol. 45 No. 1, pp. 44-53. https://doi.org/10.1108/IR-05-2017-0089
Publisher
Emerald Publishing Limited
Copyright © 2018, Emerald Publishing Limited