
Pixel-Level Collision-Free Grasp Prediction Network for Medical Test Tube Sorting on Cluttered Trays



Abstract:

Robotic sorting shows promise for future developments in the medical field. However, vision-based grasp detection of medical devices usually takes place in unstructured or cluttered environments, which raises major challenges for the development of robotic sorting systems. In this letter, a pixel-level grasp detection method is proposed to predict the optimal collision-free grasp configuration from RGB images. First, an Adaptive Grasp Flex Classify (AGFC) model is introduced to add category attributes that distinguish test tube arrangements in complex scenarios. Then, we propose an end-to-end trainable CNN-based architecture that generates the AGFC model, delivering high-quality grasp detection results while avoiding confusion during network learning. Building on this, we design a Residual Efficient Atrous Spatial Pyramid (REASP) block to further increase grasp detection accuracy. Finally, a collision-free manipulation policy is designed to guide the robot's grasping. Experiments in various scenarios demonstrate the robustness and effectiveness of our approach, and a robotic grasping platform is constructed to evaluate its application performance. Overall, the developed robotic sorting system achieves a success rate of 95% on test tube sorting.
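
The abstract names a REASP block but does not define it; a minimal sketch follows, assuming the block resembles an atrous spatial pyramid (parallel dilated 3x3 convolutions at several rates, as in DeepLab-style ASPP) fused by a 1x1 convolution and wrapped in a residual connection. The class name, dilation rates, and channel counts below are illustrative assumptions, not the authors' published design.

```python
# Hypothetical REASP-style block: parallel atrous convolutions plus a
# residual skip. All design choices here are assumptions for illustration.
import torch
import torch.nn as nn


class REASPBlock(nn.Module):
    def __init__(self, channels: int, dilations=(1, 2, 4, 8)):
        super().__init__()
        # One atrous 3x3 branch per dilation rate; padding = dilation
        # keeps the spatial resolution unchanged.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # A 1x1 convolution fuses the concatenated branch outputs back
        # to the input channel count so the residual add is shape-valid.
        self.fuse = nn.Sequential(
            nn.Conv2d(channels * len(dilations), channels,
                      kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pyramid = torch.cat([b(x) for b in self.branches], dim=1)
        return self.relu(x + self.fuse(pyramid))  # residual connection


if __name__ == "__main__":
    block = REASPBlock(channels=64)
    feats = torch.randn(1, 64, 60, 80)  # e.g. a downsampled feature map
    assert block(feats).shape == feats.shape
```

The multi-rate dilated branches enlarge the receptive field without downsampling, which suits pixel-level grasp maps, while the residual skip eases optimization of the deeper network.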
Published in: IEEE Robotics and Automation Letters (Volume: 8, Issue: 12, December 2023)
Page(s): 7897 - 7904
Date of Publication: 09 October 2023

