TY - JOUR
T1 - Motion Correction for Simultaneous PET/MR Brain Imaging Using a Radiofrequency-Penetrable PET Insert
JF - Journal of Nuclear Medicine
JO - J Nucl Med
SP - 367
LP - 367
VL - 61
IS - supplement 1
AU - Jonathan Fisher
AU - Andrew Groll
AU - Craig Levin
Y1 - 2020/05/01
UR - http://jnm.snmjournals.org/content/61/supplement_1/367.abstract
N2 - Introduction: Subject motion during PET/MR scans reduces image quality and accuracy. In this work, a motion correction (MC) solution and experimental results demonstrating its performance on a three-point-source motion phantom are presented. Our camera-based solution tracks the motion of a marker attached to the object being imaged, translates the motion measured by the camera into the scanner coordinate system, and incorporates that motion data into the image reconstruction algorithm. The experimental data were acquired with the radiofrequency-penetrable brain-dedicated PET insert for simultaneous PET/MR [1-3].
Methods: MR-compatible PET insert - The RF-penetrable PET system comprises a ring of 16 detector modules. These modules employ arrays of 3.2 x 3.2 x 20 mm3 LYSO crystal elements coupled one-to-one to silicon photomultiplier (SiPM) arrays. Each module has 128 crystals, providing a 3 cm axial crystal FOV for this prototype system. The RF-penetrable PET ring has a 32-cm internal diameter (an additional phased-array RF receive coil reduces the effective diameter to 28 cm) and is designed to be inserted into a 3T MR system (Fig. 1).
Camera-based rigid-body motion tracking - A 2.0-megapixel web camera with a 1/2.7" CMOS OV2710 image sensor was used. The camera was located 9-11 cm away from the marker, outside the bore. The camera-based motion tracking approach includes an initial camera calibration step (to correct for the camera's distortion). Then, for every frame, the marker's pose T_{m→c} (rotation and translation) with respect to the camera coordinate system was estimated, as suggested in [4-6] and implemented using the "camera calibration and 3D vision" MATLAB toolbox. Knowing the poses at times 0 and t, the relative change in pose is defined as [7]: T_{m_t→m_0} = T_{c→m_0} T_{m_t→c} = (T_{m_0→c})^{-1} T_{m_t→c}.
Camera-to-scanner cross-calibration - Knowing the camera-to-scanner coordinate transformation, it was possible to translate the motion recorded in the camera coordinate system into motion in the scanner coordinate system. The Tsai-Lenz hand-eye calibration [8] was performed to estimate this transformation. A 2D checkerboard marker and three point sources (500, 500, and 100 μCi) were mounted on a 3D-printed motion sphere (Fig. 2). The sphere's pose was changed ten times. For each pose, a picture of the marker was taken and 10 minutes of PET data were acquired. The data were energy-windowed to 410-610 keV with a 10 ns time window. Images were generated using GPU-accelerated 3D-OSEM reconstruction with 5 iterations and one subiteration per iteration [10]. System sensitivity normalization was obtained from a separate annulus-based normalization study acquired over 7 hours with an initial activity of 3.52 mCi of FDG. The 10 samples were randomly divided: 6 samples were used to estimate the cross-calibration and the remaining samples were used to evaluate the solution's performance.
MAF reconstructed image - The multiple acquisition frames (MAF) correction method [9] for MC was implemented on the 4 remaining images.
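As an illustration of the pose-tracking and cross-calibration steps above, the following is a minimal Python/NumPy sketch, not the authors' MATLAB implementation. The helper names (pose, relative_pose, camera_to_scanner_motion) are hypothetical, poses are assumed to be 4x4 homogeneous matrices, and the camera-to-scanner transform is assumed to come from a Tsai-Lenz-style hand-eye calibration (OpenCV's cv2.calibrateHandEye with the TSAI method is one available implementation).

```python
# Minimal sketch (illustrative, not the authors' code): relative marker motion
# from two camera-frame pose estimates, mapped into scanner coordinates via a
# hand-eye cross-calibration transform. Poses are 4x4 homogeneous matrices.
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t, dtype=float).ravel()
    return T

def relative_pose(T_m0_to_c, T_mt_to_c):
    """Relative change in marker pose between times 0 and t, per the abstract:
    T_{m_t->m_0} = (T_{m_0->c})^-1 @ T_{m_t->c}."""
    return np.linalg.inv(T_m0_to_c) @ T_mt_to_c

def camera_to_scanner_motion(T_motion_cam, T_c_to_s):
    """Express a motion transform given in camera coordinates in scanner
    coordinates, using the camera-to-scanner cross-calibration T_{c->s}
    (change of basis): T_s = T_{c->s} @ T_cam @ T_{c->s}^-1."""
    return T_c_to_s @ T_motion_cam @ np.linalg.inv(T_c_to_s)

# Example with made-up poses (T_cross would be the hand-eye calibration result):
# T0 = pose(np.eye(3), [0.0, 0.0, 0.10])
# Tt = pose(np.eye(3), [0.01, 0.0, 0.10])      # marker shifted 1 cm along x
# dT_cam = relative_pose(T0, Tt)
# dT_scanner = camera_to_scanner_motion(dT_cam, T_cross)
```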
Knowing the relative motion between frames and the camera-to-scanner cross-calibration, we corrected for motion between frames and aligned the reconstructed images.
Results: Reconstructed images of a single frame and of multiple frames with and without MC are presented in Figs. 3-5. The MC image (Fig. 5) is almost identical to the single-frame image (Fig. 3).
Conclusions: This study demonstrated the ability of our camera-based MC solution to compensate for phantom motion. The same approach can be generalized to more complicated phantoms and to patients.
Acknowledgment: This work was supported in part by NIH grants 3R01EB01946504 and 3R01EB01946504S1.
ER -
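The MAF-style combination of the independently reconstructed frames could look roughly like the following NumPy/SciPy sketch. It assumes all frames are reconstructed on a common voxel grid with isotropic spacing, that the measured rigid motion for each frame is available as a 4x4 transform in scanner coordinates (mm) taking reference-frame positions to frame-k positions, and that the grid origin, spacing, and function names are illustrative; this is not the authors' GPU reconstruction code.

```python
# Minimal sketch of MAF-style alignment: each reconstructed frame is resampled
# into the reference frame using the measured rigid motion, then the aligned
# frames are summed. Geometry conventions here are assumptions.
import numpy as np
from scipy.ndimage import affine_transform

def align_frame(frame_img, T_ref_to_frame, voxel_size_mm, origin_mm):
    """Resample one reconstructed frame into the reference frame.

    T_ref_to_frame: 4x4 rigid transform (scanner coords, mm) mapping a point's
    reference-frame position to its position during this frame's acquisition.
    """
    R = T_ref_to_frame[:3, :3]
    t = T_ref_to_frame[:3, 3]
    origin = np.asarray(origin_mm, dtype=float)
    s = float(voxel_size_mm)  # isotropic voxel size (assumption)
    # affine_transform maps each output voxel index v0 to an input index:
    #   v_k = R @ v0 + (R @ origin + t - origin) / s
    offset = (R @ origin + t - origin) / s
    return affine_transform(frame_img, R, offset=offset, order=1)

def maf_combine(frame_imgs, motions, voxel_size_mm=2.0, origin_mm=(0.0, 0.0, 0.0)):
    """Sum the motion-aligned frames (multiple-acquisition-frames correction)."""
    aligned = [align_frame(img, T, voxel_size_mm, origin_mm)
               for img, T in zip(frame_imgs, motions)]
    return np.sum(aligned, axis=0)
```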