Abstract:
Since parathyroid glands (PG) regulate the body’s calcium levels and significantly impact human health, developing methods for their automatic detection during endoscopic thyroid surgery is of utmost clinical significance. However, existing parathyroid detection methods suffer from color variations, target deformation, blur, and lighting effects in intricate surgical environments. To address these shortcomings, in this paper, we propose a novel double-layer graph attention network for PG detection, which explicitly facilitates local augmentation via the identification of key visual features (e.g., texture and shape) and global interactions. It robustly combats image blur and better differentiates PG targets from background regions, thus improving detection precision. Furthermore, we observe that most prior works fail to deeply understand the spatial relations among targets and inevitably suffer from false or missed detections, largely because they ignore or insufficiently utilize depth information, especially under lighting variations and occlusions. To fill this gap, we propose a depth relation augmentation component that adaptively captures the prominent relative positional relations between targets based on depth information and incorporates them into the proposed GNN framework, significantly deepening spatial understanding and naturally enhancing generalizability. Since no thyroid endoscopic surgery benchmark exists for evaluating this task, we meticulously established a novel dataset from 838 actual surgeries conducted (via the fully laparoscopic thoracic-breast approach) at the Fujian Medical University Union Hospital. Extensive experiments show that our framework achieves superior PG detection accuracy compared to current state-of-the-art counterparts while maintaining real-time efficiency.
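The abstract does not include implementation details, but as a rough illustration of the graph-attention mechanism it refers to, the following is a minimal NumPy sketch of a single GAT-style attention layer (in the style of Veličković et al.); all function names, shapes, and parameters here are our assumptions for illustration, not the authors' code.

```python
import numpy as np

def gat_layer(X, A, W, a, leaky_slope=0.2):
    """One graph-attention layer (illustrative sketch, not the paper's model).
    X: (N, F) node features; A: (N, N) adjacency with self-loops (1 = edge);
    W: (F, F_out) linear projection; a: (2*F_out,) attention weight vector."""
    H = X @ W                                   # project node features
    N = H.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [h_i || h_j]) for every node pair
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            z = np.concatenate([H[i], H[j]]) @ a
            e[i, j] = z if z > 0 else leaky_slope * z
    e = np.where(A > 0, e, -1e9)                # mask out non-edges
    # row-wise softmax over neighbours (numerically stabilized)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ H                            # attention-weighted aggregation
```

In a detection setting such as the one described, nodes would typically correspond to candidate regions or feature-map locations, and the adjacency would encode their spatial (or, here, depth-informed) relations.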
Date of Conference: 03-06 December 2024
Date Added to IEEE Xplore: 10 January 2025