Slim UNETR: Scale Hybrid Transformers to Efficient 3D Medical Image Segmentation Under Limited Computational Resources


Abstract:

Hybrid transformer-based segmentation approaches have shown great promise in medical image analysis. However, they typically require considerable computational power and resources during both training and inference stages, posing a challenge for resource-limited medical applications common in the field. To address this issue, we present an innovative framework called Slim UNETR, designed to achieve a balance between accuracy and efficiency by leveraging the advantages of both convolutional neural networks and transformers. Our method features the Slim UNETR Block as a core component, which effectively enables information exchange through self-attention mechanism decomposition and cost-effective representation aggregation. Additionally, we utilize the throughput metric as an efficiency indicator to provide feedback on model resource consumption. Our experiments demonstrate that Slim UNETR outperforms state-of-the-art models in terms of accuracy, model size, and efficiency when deployed on resource-constrained devices. Remarkably, Slim UNETR achieves 92.44% dice accuracy on BraTS2021 while being 34.6x smaller and 13.4x faster during inference compared to Swin UNETR. Code: https://github.com/aigzhusmart/Slim-UNETR
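
The abstract reports throughput as the efficiency indicator for resource-constrained deployment. As a rough illustration only (not the authors' benchmarking code), the PyTorch-style sketch below shows one common way to measure inference throughput (volumes per second) for a 3D segmentation model; the input shape, warm-up count, and iteration count are illustrative assumptions.

    # Minimal sketch (assumption, not from the paper): measure inference
    # throughput of a 3D segmentation model in volumes per second.
    import time
    import torch

    def measure_throughput(model, input_shape=(1, 4, 128, 128, 128),
                           warmup=10, iters=50, device="cuda"):
        """Run repeated forward passes and return volumes processed per second."""
        model = model.to(device).eval()
        x = torch.randn(*input_shape, device=device)  # dummy 3D volume (B, C, D, H, W)
        with torch.no_grad():
            for _ in range(warmup):      # warm-up passes to stabilize clocks/caches
                model(x)
            if device == "cuda":
                torch.cuda.synchronize()
            start = time.time()
            for _ in range(iters):
                model(x)
            if device == "cuda":
                torch.cuda.synchronize()
            elapsed = time.time() - start
        return iters * input_shape[0] / elapsed

A higher value means the model pushes more volumes through per second on the target device, which is the sense in which the paper compares Slim UNETR against larger hybrid transformers such as Swin UNETR.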
Published in: IEEE Transactions on Medical Imaging (Volume: 43, Issue: 3, March 2024)
Page(s): 994 - 1005
Date of Publication: 20 October 2023

PubMed ID: 37862274
