Abstract
Objectives: PET images are severely contaminated by Poisson noise due to the low sensitivity of PET scanners and the limited acquisition time in clinical use. Penalized, or regularized, image reconstruction is an effective way to suppress the noise by incorporating a smoothness penalty into the reconstruction process. The strength of the penalty is controlled by a hyper-parameter. An inappropriately selected hyper-parameter may over- or under-estimate the noise, resulting in an over-smoothed or under-converged image. The purpose of this paper is to evaluate the impact of the sensitivity map and the noise equivalent counts on hyper-parameter selection for regularized reconstruction.
Methods: OSEM with time-of-flight (TOF) and point spread function (PSF) modeling is taken as the standard reconstruction algorithm. A smoothness penalty, the total variation (TV) of the image, is added to the standard reconstruction to form a regularized reconstruction algorithm. The hyper-parameter is modeled as the product of a step size, a global factor, and a local factor. The step size is selected manually; empirically, an optimized step size can be obtained through phantom studies. The global factor is a constant determined by the noise equivalent counts (NEC); it evaluates the quality of the data collected under varying activity and acquisition time. The local factor, which is voxel dependent, estimates the noise attributable to the spatially variant sensitivity as well as the attenuation effect; more weight is given to voxels with lower sensitivity and higher attenuation. Two phantom studies were performed on a clinical PET/CT scanner (uMI 780, United Imaging Healthcare) to verify the effectiveness of the proposed method. A cylinder phantom filled with uniformly distributed 68Ge was placed at the center of the transverse field of view (FOV) and scanned for 30 min; the phantom is longer than 30 cm, so the noise distribution could be measured over the entire axial FOV. A NEMA body phantom was filled with 0.14 uCi/cc of 18F-FDG, with the hot-sphere concentration 4 times that of the background. It was scanned for 15 min, and the list-mode data were then split into 20 sets to simulate different NECs. Contrast recovery coefficients (CRs) of the hot spheres and background variabilities were plotted as functions of iteration number. Finally, the proposed method was applied to clinical data to verify its merit in clinical practice.
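The hyper-parameter model described above can be sketched in illustrative notation (the symbols below are our own shorthand, not necessarily the authors'; the exact form of the global factor g and local factor l_j is an assumption):

```latex
% Penalized-likelihood objective with a TV penalty (illustrative):
%   the voxel-wise penalty strength beta_j multiplies the local TV term.
\hat{x} \;=\; \arg\max_{x \ge 0} \; L(y \mid x) \;-\; \sum_{j} \beta_j \, \mathrm{TV}_j(x),
\qquad
\beta_j \;=\; \tau \cdot g(\mathrm{NEC}) \cdot l_j
```

Here L(y | x) is the Poisson log-likelihood of the measured data y given the image x, tau is the manually selected step size, g(NEC) is the global factor determined by the noise equivalent counts, and l_j is the local factor, which is larger for voxels with lower sensitivity and stronger attenuation.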
Results: From the reconstructed images of the cylinder phantom, we find that the axial noise distribution is more uniform when the sensitivity map is incorporated into the hyper-parameter. For the NEMA body phantom, images from the NEC-dependent regularized reconstruction have similar background noise even though the acquisition time varies from 1 minute to 15 minutes. For the 1-minute scan, the noise decreases by 77.8% and 52.3% compared to the standard reconstruction and the NEC-independent regularized reconstruction, respectively. Further analysis shows that the background variabilities become stable after the first few iterations while the CRs continue to increase; therefore, more iterations can be run until the CRs converge.
Conclusions: The phantom and clinical studies have demonstrated that incorporating the sensitivity map and the NEC into the hyper-parameter effectively improves the robustness of regularized reconstruction against noise related to spatially variant sensitivity, acquisition time, and activity. The image converges after adequate iterations, thus achieving more accurate quantification for PET imaging.