
Self-Distilled Hierarchical Network for Unsupervised Deformable Image Registration


Abstract:

Unsupervised deformable image registration benefits from progressive network structures such as Pyramid and Cascade. However, existing progressive networks only consider the single-scale deformation field in each level or stage and ignore the long-term connection across non-adjacent levels or stages. In this paper, we present a novel unsupervised learning approach named Self-Distilled Hierarchical Network (SDHNet). By decomposing the registration procedure into several iterations, SDHNet generates hierarchical deformation fields (HDFs) simultaneously in each iteration and connects different iterations via the learned hidden state. Specifically, hierarchical features are extracted to generate HDFs through several parallel gated recurrent units, and the HDFs are then fused adaptively, conditioned on themselves as well as contextual features from the input image. Furthermore, unlike common unsupervised methods that apply only a similarity loss and a regularization loss, SDHNet introduces a novel self-deformation distillation scheme. This scheme distills the final deformation field as teacher guidance, adding constraints on intermediate deformation fields in the deformation-value and deformation-gradient spaces, respectively. Experiments on five benchmark datasets, including brain MRI and liver CT, demonstrate the superior performance of SDHNet over state-of-the-art methods, with faster inference speed and a smaller GPU memory footprint. Code is available at https://github.com/Blcony/SDHNet.
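The self-deformation distillation scheme described above can be illustrated with a minimal sketch. The function names and the exact loss form below are assumptions for illustration only (the abstract does not give formulas): the final deformation field acts as a fixed teacher, and an intermediate (student) field is penalized for deviating from it both in deformation values and in spatial gradients computed by finite differences.

```python
import numpy as np

def spatial_grads(field):
    # Finite-difference spatial gradients of a (3, D, H, W) deformation field,
    # one array per spatial axis (a stand-in for the deformation-gradient space).
    return [np.diff(field, axis=ax) for ax in (1, 2, 3)]

def self_distillation_loss(intermediate, final):
    # Hypothetical sketch: the final field is the teacher (treated as a
    # constant, i.e. no gradient flows back through it in a real framework),
    # constraining the intermediate field in deformation-value and
    # deformation-gradient spaces with simple mean-squared errors.
    teacher = final.copy()  # stands in for detaching the teacher from backprop
    value_term = np.mean((intermediate - teacher) ** 2)
    grad_term = sum(
        np.mean((gi - gt) ** 2)
        for gi, gt in zip(spatial_grads(intermediate), spatial_grads(teacher))
    )
    return value_term + grad_term
```

In a real training loop this term would be added to the similarity and regularization losses; here it only shows that the teacher signal constrains both the values and the smoothness structure of intermediate fields.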
Published in: IEEE Transactions on Medical Imaging ( Volume: 42, Issue: 8, August 2023)
Page(s): 2162 - 2175
Date of Publication: 13 February 2023

PubMed ID: 37022910
