
Unsupervised Cross-Modality Domain Adaptation Network for X-Ray to CT Registration


Abstract:

2D/3D registration that achieves high accuracy and real-time computation is one of the enabling technologies for radiotherapy and image-guided surgeries. Recently, Convolutional Neural Networks (CNNs) have been explored to significantly improve the accuracy and efficiency of 2D/3D registration. A pair of intraoperative 2D X-ray images and synthetic data rendered from the pre-operative volume is typically required to model the nonconvex mapping between registration parameters and image residual. However, collecting a large clinical dataset of X-ray images with accurate poses can be very challenging or even impractical, while training exclusively on synthetic data frequently degrades performance when the model is tested on real X-rays. We therefore propose to first train a model on the source domain (i.e., synthetic data) to build the appearance-pose relationship, and then use an unsupervised cross-modality domain adaptation network (UCMDAN) to adapt the model to the target domain (i.e., X-rays) through adversarial learning. We propose to narrow the significant domain gap by alignment in both pixel and feature space. In particular, image appearance transformation and domain-invariant feature learning across multiple aspects are performed synergistically. Extensive experiments on CT and CBCT datasets show that the proposed UCMDAN outperforms existing state-of-the-art domain adaptation approaches.
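As a rough illustration of the feature-space half of this idea (the paper also aligns images in pixel space), the sketch below shows a generic adversarial domain adaptation training step in PyTorch: a shared encoder is supervised for pose regression on labelled synthetic data, while a domain discriminator pushes its features to become indistinguishable between synthetic images and unlabelled X-rays. All module names, network sizes, and loss weights are illustrative assumptions, not the authors' UCMDAN implementation.

# Minimal sketch of adversarial feature-space alignment for unsupervised
# domain adaptation. Hypothetical names and hyper-parameters throughout.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Shared CNN encoder applied to both synthetic (source) and X-ray (target) images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
    def forward(self, x):
        return self.net(x)

class PoseRegressor(nn.Module):
    """Predicts 6-DoF registration parameters from encoded features."""
    def __init__(self, dim=64):
        super().__init__()
        self.fc = nn.Linear(dim, 6)
    def forward(self, f):
        return self.fc(f)

class DomainDiscriminator(nn.Module):
    """Classifies whether a feature vector came from the synthetic or the X-ray domain."""
    def __init__(self, dim=64):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1))
    def forward(self, f):
        return self.fc(f)

encoder, regressor, discriminator = FeatureExtractor(), PoseRegressor(), DomainDiscriminator()
opt_task = torch.optim.Adam(list(encoder.parameters()) + list(regressor.parameters()), lr=1e-4)
opt_disc = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(syn_img, syn_pose, xray_img):
    # 1) Supervised pose regression on labelled synthetic data only.
    f_syn = encoder(syn_img)
    task_loss = nn.functional.mse_loss(regressor(f_syn), syn_pose)

    # 2) Discriminator learns to tell synthetic features from X-ray features.
    f_xray = encoder(xray_img)
    d_loss = bce(discriminator(f_syn.detach()), torch.ones(f_syn.size(0), 1)) + \
             bce(discriminator(f_xray.detach()), torch.zeros(f_xray.size(0), 1))
    opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

    # 3) Encoder is trained to fool the discriminator (domain-invariant features)
    #    while still solving the pose task; 0.1 is an assumed adversarial weight.
    adv_loss = bce(discriminator(f_xray), torch.ones(f_xray.size(0), 1))
    total = task_loss + 0.1 * adv_loss
    opt_task.zero_grad(); total.backward(); opt_task.step()
    return task_loss.item(), d_loss.item(), adv_loss.item()

A common alternative to the alternating updates shown here is a gradient-reversal layer, which folds the adversarial objective into a single backward pass; either way, only the synthetic branch carries pose supervision, which is the setting the abstract describes.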
Published in: IEEE Journal of Biomedical and Health Informatics (Volume 26, Issue 6, June 2022)
Page(s): 2637 - 2647
Date of Publication: 16 December 2021

PubMed ID: 34914602
