DFSNE-Net: Deviant feature sensitive noise estimate network for low-dose CT denoising☆
Introduction
Computed tomography (CT) is a common examination technique in modern clinical medicine. It uses X-rays, gamma rays, ultrasound, or other related means, in coordination with highly sensitive detectors, to perform cross-sectional scans of part or all of the body. Its scanning and imaging speed is extremely fast, and it can be used to examine a wide variety of diseases. However, CT image quality can be seriously degraded by factors such as the scanning machine, the scanning environment, and the patient's own physical condition.
Although CT provides critical clinical information, the radiation involved in scanning poses a potential risk [1], [2]. Over the last decade, radiation doses in CT have shown a decreasing trend; for example, typical dose levels for coronary CT angiography decreased from approximately 12 mSv in 2009 [3] to 1.5 mSv in 2014 [4]. However, the reduction in radiation dose inevitably leads to local deviations in HU values [5] along with noise and artifacts. Therefore, to reduce harm to the patient, support more accurate judgments about the patient's condition, improve image quality, and display image features more clearly, low-dose CT denoising has gradually become an important issue in the CT field and has received increasing academic attention.
Low-dose CT denoising methods fall into three main categories: image filtering, iterative reconstruction, and image post-processing (the latter including both traditional image processing methods and deep learning methods).
(1) Image filtering algorithms usually apply sinogram processing to CT data before image reconstruction. Manduca et al. [6] proposed bilateral filtering combined with a CT noise model, which achieves a better noise–resolution tradeoff than conventional reconstruction algorithms. Other typical methods include structure-adaptive filtering [7] and penalized weighted least-squares algorithms [8]. Li et al. [9] proposed a statistical model for sinogram data and developed a penalized likelihood method to suppress quantum noise. Image filtering algorithms are often limited in practical applications because projection data are difficult to obtain.
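The bilateral filtering idea of Manduca et al. [6] weights neighboring samples by both spatial proximity and intensity similarity, so sharp structures survive while quantum noise is averaged out. A minimal single-channel sketch in plain NumPy, applied here to a generic 2-D array rather than a real sinogram:

```python
import numpy as np

def bilateral_filter(image, radius=2, sigma_s=1.5, sigma_r=20.0):
    """Naive bilateral filter: each output pixel is a weighted average of its
    neighbors, with weights combining spatial distance and intensity distance."""
    h, w = image.shape
    pad = np.pad(image, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma_s ** 2))  # fixed spatial kernel
    out = np.empty_like(image, dtype=float)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rangew = np.exp(-((patch - image[i, j]) ** 2) / (2 * sigma_r ** 2))
            weights = spatial * rangew
            out[i, j] = (weights * patch).sum() / weights.sum()
    return out

# Toy phantom: a sharp 0/100 edge plus Gaussian noise standing in for CT noise
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[:, 16:] = 100.0
noisy = clean + rng.normal(0, 5, clean.shape)
filtered = bilateral_filter(noisy)
```

With `sigma_r` well below the edge height, samples across the edge receive near-zero range weight, which is how the filter achieves the noise–resolution tradeoff described above.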
(2) Iterative reconstruction algorithms usually incorporate prior knowledge from the image domain for denoising. For example, total variation (TV) methods [10], [11], [12], [13] incorporate prior knowledge about data noise and image content into the objective function to optimally reconstruct the tomographic image, which greatly improves denoising performance. Nonlocal mean methods [14], [15], [16], dictionary learning methods [17], [18], and other related techniques [19], [20], [21] have also been formulated. However, a large amount of computation is spent on the iterative projection/back-projection steps, which require long processing times on specialized hardware as well as the availability of CT projection data.
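The TV prior in [10], [11], [12], [13] penalizes the total gradient magnitude of the image, which favors piecewise-constant solutions. A minimal image-domain sketch (plain gradient descent on a smoothed TV objective, not a full iterative reconstruction with projection data — parameters are illustrative):

```python
import numpy as np

def tv_denoise(noisy, lam=5.0, step=0.1, iters=200, eps=1e-3):
    """Minimize 0.5*||x - y||^2 + lam * TV(x) by gradient descent,
    with TV smoothed as the sum of sqrt(|grad x|^2 + eps)."""
    x = noisy.astype(float).copy()
    for _ in range(iters):
        gx = np.diff(x, axis=1, append=x[:, -1:])  # forward diff, zero at right edge
        gy = np.diff(x, axis=0, append=x[-1:, :])  # forward diff, zero at bottom edge
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
        # divergence of the normalized gradient field = (sub)gradient of TV
        div = (np.diff(gx / mag, axis=1, prepend=0.0)
               + np.diff(gy / mag, axis=0, prepend=0.0))
        x -= step * ((x - noisy) - lam * div)
    return x

def total_variation(x):
    return np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()

# Piecewise-constant phantom with additive noise
rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[16:48, 16:48] = 100.0
noisy = phantom + rng.normal(0, 10, phantom.shape)
restored = tv_denoise(noisy)
```

The regularization weight `lam` trades smoothing strength against fidelity to the noisy input; real iterative reconstruction applies the same prior inside the projection/back-projection loop.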
(3) Traditional image post-processing algorithms post-process the reconstructed CT images directly instead of processing the original projection data. Initially, researchers tried classical image processing algorithms such as nonlocal mean filtering [22], [23], [24], dictionary-learning-based methods [25], [26], block matching algorithms [27], [28], [29], and diffusion filters [30], which are much more computationally efficient than iterative reconstruction algorithms. However, the noise in reconstructed low-dose CT images is often unevenly distributed, and these algorithms do not fully solve the low-dose CT denoising problem, so they are not widely used.
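Nonlocal means [22], [23], [24] replaces each pixel by a weighted average of pixels whose surrounding patches look similar, so repeated structure reinforces itself while noise averages out. A deliberately naive sketch on a small toy image:

```python
import numpy as np

def nlm_denoise(image, patch=3, search=7, h=10.0):
    """Naive non-local means: weight each candidate pixel by the similarity
    of its surrounding patch to the reference patch, then average."""
    pr, sr = patch // 2, search // 2
    pad = np.pad(image, pr + sr, mode="reflect")
    H, W = image.shape
    out = np.zeros_like(image, dtype=float)
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            wsum = acc = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    cand = pad[ci + di - pr:ci + di + pr + 1,
                               cj + dj - pr:cj + dj + pr + 1]
                    w = np.exp(-((ref - cand) ** 2).mean() / (h * h))
                    wsum += w
                    acc += w * pad[ci + di, cj + dj]
            out[i, j] = acc / wsum
    return out

# Small toy image: two flat regions separated by an edge, plus noise
rng = np.random.default_rng(0)
clean = np.zeros((16, 16))
clean[:, 8:] = 40.0
noisy = clean + rng.normal(0, 5, clean.shape)
smoothed = nlm_denoise(noisy)
```

The filtering parameter `h` plays the role of an assumed noise level; a pixelwise fixed `h` is exactly what breaks down when, as noted above, low-dose CT noise is unevenly distributed across the image.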
(4) Within image post-processing, the use of deep neural networks for low-dose CT denoising has gradually become an important approach in recent years. The convolutional neural network (CNN) is a class of deep neural networks with nonlinearities and convolutional filters that has proven very effective in a variety of tasks such as image classification, denoising [31], [32], [33], and super-resolution [34]. Early denoising research focused on CNN structure optimization and adaptation, such as RED-CNN [35] and wavelet networks [36]. However, the root mean square error loss used in these methods can produce overly smooth denoised images; Johnson et al. proposed a perceptual loss based on a pre-trained VGG model [37], [38] to address this problem. Generative adversarial networks (GANs) [39], [40] were subsequently introduced for low-dose CT denoising, transforming the image denoising problem into a high-quality CT generation problem, reducing reliance on pixel-level regression losses [41], [42], [43], [44] and improving the quality of the denoised images.
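A unifying idea in CNN denoisers such as RED-CNN [35] is residual learning: the network predicts the noise map and subtracts it from the input. As a toy illustration (not the architecture of any cited network), a single trainable 3x3 convolution can be fit the same way in plain NumPy; minimizing the error between the predicted and true noise is algebraically identical to minimizing the error of the denoised image:

```python
import numpy as np

def conv3x3(x, k):
    """'Same'-size 3x3 convolution with reflected padding (no framework needed)."""
    p = np.pad(x, 1, mode="reflect")
    out = np.zeros_like(x)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + x.shape[0], dj:dj + x.shape[1]]
    return out

rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[:, 32:] = 1.0
noise = rng.normal(0, 0.2, clean.shape)
noisy = clean + noise

k = rng.normal(0, 0.1, (3, 3))  # the "network": one trainable 3x3 kernel
lr = 0.1
p = np.pad(noisy, 1, mode="reflect")
for _ in range(500):  # full-batch gradient descent on mean((pred - noise)^2)
    err = conv3x3(noisy, k) - noise
    for di in range(3):
        for dj in range(3):
            k[di, dj] -= lr * 2 * (err * p[di:di + 64, dj:dj + 64]).mean()

denoised = noisy - conv3x3(noisy, k)  # residual learning: subtract predicted noise
```

Deep networks replace this single linear filter with many nonlinear layers, and the perceptual and adversarial losses discussed above replace the plain pixelwise objective.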
The rest of this paper is organized as follows. Section 2 presents a deviant analysis of noise through a mathematical model. Section 3 introduces the key methods used in our proposed denoising model. Section 4 reports the experimental details and results. Section 5 discusses related issues, gives future research directions, and concludes the paper.
Deviant analysis of noise
In order to better define and utilize the estimated noise for the denoising task, we performed a statistical analysis of the distribution of HU values in low-dose CT (LDCT) and normal-dose CT (NDCT) from the public AAPM dataset [45], which was released for the low-dose CT denoising challenge launched in 2016; we specifically use the version updated in 2021. The data provider added Poisson noise to each case to match the noise level of a quarter-dose CT. Using the difference between LDCT and NDCT as the
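The moment statistics behind such an analysis are straightforward to compute. A sketch with hypothetical stand-in arrays (real registered LDCT/NDCT slices in HU would replace them); the skewness and kurtosis of the LDCT−NDCT difference map quantify how far the noise departs from a Gaussian (skewness 0, kurtosis 3):

```python
import numpy as np

def skewness(x):
    x = x.ravel()
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3  # 0 for a symmetric distribution

def kurtosis(x):
    x = x.ravel()
    m, s = x.mean(), x.std()
    return ((x - m) ** 4).mean() / s ** 4  # 3 for a Gaussian

# Hypothetical stand-ins for a registered NDCT slice and its quarter-dose LDCT
rng = np.random.default_rng(0)
ndct = rng.normal(40.0, 10.0, (256, 256))
ldct = ndct + (rng.poisson(5.0, (256, 256)) - 5.0)  # added Poisson-type noise
noise_map = ldct - ndct  # reference noise, as the difference LDCT - NDCT
```

For Poisson noise with λ = 5 the theoretical skewness is 1/√5 ≈ 0.45 and the kurtosis is 3 + 1/5 = 3.2, so even this simple difference map is measurably non-Gaussian.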
Methods
The whole CT denoising process is shown in Fig. 2 and is divided into three steps. (1) The CT image passes through the trained deviant feature sensitive noise estimate network (DFSNE-Net) to obtain the estimated noise; (2) the estimated noise is normalized by the noise distribution normalization based on skewness and kurtosis (SK-NDN) to obtain the normalized estimated noise, and the low credible noise suppression based on confidence interval (CI-LCNS) is also used to obtain the
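The snippet above does not spell out the SK-NDN and CI-LCNS transforms, so the following is only an illustrative guess at the two steps: a moment-driven normalization that nudges the estimated noise toward Gaussian kurtosis, and a confidence-interval mask that zeroes estimates outside mean ± z·std. All function names and parameters here are assumptions, not the paper's definitions:

```python
import numpy as np

def kurtosis(x):
    m, s = x.mean(), x.std()
    return ((x - m) ** 4).mean() / s ** 4  # 3 for a Gaussian

def sk_ndn_sketch(noise_est, target_kurt=3.0, iters=25):
    """Guess at SK-NDN: standardize, then adjust a tail exponent gamma until
    the kurtosis approaches the Gaussian target (heavier tails -> gamma < 1)."""
    z = (noise_est - noise_est.mean()) / (noise_est.std() + 1e-8)
    gamma = 1.0
    for _ in range(iters):
        w = np.sign(z) * np.abs(z) ** gamma
        gamma *= 0.98 if kurtosis(w) > target_kurt else 1.02
    w = np.sign(z) * np.abs(z) ** gamma
    return (w - w.mean()) / (w.std() + 1e-8)

def ci_lcns_sketch(noise_est, z_crit=1.96):
    """Guess at CI-LCNS: treat estimates outside the mean +/- z*std confidence
    band as low-credibility and suppress them to zero."""
    m, s = noise_est.mean(), noise_est.std()
    keep = np.abs(noise_est - m) <= z_crit * s
    return np.where(keep, noise_est, 0.0)

# Heavy-tailed stand-in for an estimated noise map, with one planted outlier
rng = np.random.default_rng(0)
est = rng.laplace(0.0, 1.0, (128, 128))
est[0, 0] = 50.0
normalized = sk_ndn_sketch(est)
suppressed = ci_lcns_sketch(est)
```

The intent matches the pipeline description: normalization reshapes the distribution of the estimated noise, while suppression discards low-credibility estimates before subtraction.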
Dataset
The AAPM [45] dataset uses a moderately smooth reconstruction kernel (B30), 120 kV tube voltage, and 1 mm slice thickness; we use 10 cases containing quarter-dose and full-dose CT. Because we also conducted denoising experiments on the training data, and the amount of training data is sufficient to support the training task of the network, we took 5 cases as the training set and 5 cases as the test set for qualitative and quantitative evaluation of the denoising results. The slices from the test dataset
Conclusion
In this paper, we assume that noise is correlated with deviant features and design a network structure, DFSNE-Net, that is sensitive to deviant features: more deviant features are perceived by MSC-DFPM and then filtered and transmitted by SISA-FM, and we propose corresponding statistics-based noise optimization methods. Validated by LDCT denoising experiments, the proposed DFSNE-Net can indeed estimate more accurate noise and avoid the generation of fake
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References (48)

- Radiation exposure from CT scans in childhood and subsequent risk of leukaemia and brain tumours: a retrospective cohort study, Lancet, 2012.
- Bayesian statistical reconstruction for low-dose X-ray computed tomography using an adaptive-weighting nonlocal prior, Comput. Med. Imaging Graph., 2009.
- Computed tomography—an increasing source of radiation exposure, N. Engl. J. Med., 2007.
- Estimated radiation dose associated with cardiac CT angiography, JAMA, 2009.
- Routine low-radiation-dose coronary computed tomography angiography, Eur. Heart J. Suppl., 2014.
- Local noise estimation in low-dose chest CT images, Int. J. Comput. Assist. Radiol. Surg., 2014.
- Projection space denoising with bilateral filtering and CT noise modeling for dose reduction in CT, Med. Phys., 2009.
- Ray contribution masks for structure adaptive sinogram filtering, IEEE Trans. Med. Imaging, 2012.
- Penalized weighted least-squares approach to sinogram noise reduction and image reconstruction for low-dose X-ray computed tomography, IEEE Trans. Med. Imaging, 2006.
- Nonlinear sinogram smoothing for low-dose X-ray CT, IEEE Trans. Nucl. Sci., 2004.
- Image reconstruction in circular cone-beam computed tomography by constrained, total-variation minimization, Phys. Med. Biol.
- Low-dose CT reconstruction via edge-preserving total variation regularization, Phys. Med. Biol.
- Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction, Phys. Med. Biol.
- Few-view image reconstruction with fractional-order total variation, J. Opt. Soc. Amer. A
- Iterative image reconstruction for cerebral perfusion CT using a pre-contrast scan induced edge-preserving prior, Phys. Med. Biol.
- Spectral CT reconstruction with image sparsity and spectral mean, IEEE Trans. Comput. Imaging
- Low-dose X-ray CT reconstruction via dictionary learning, IEEE Trans. Med. Imaging
- Tensor-based dictionary learning for spectral CT reconstruction, IEEE Trans. Med. Imaging
- EM+TV based reconstruction for cone-beam CT with reduced radiation
- Learned primal–dual reconstruction, IEEE Trans. Med. Imaging
- A deep learning architecture for limited-angle computed tomography reconstruction
- Low-dose computed tomography image restoration using previous normal-dose scan, Med. Phys.
- Adaptive nonlocal means filtering based on local noise level for CT denoising, Med. Phys.
- Optimizing non-local means for denoising low dose CT
☆ This research was supported by the Natural Science Foundation of Liaoning Province, China (No. 2021-YGJC-07).