Meeting Report | Image Generation

Unsupervised learning to perform respiratory motion correction in PET imaging

Teaghan O'Briain, Carlos Uribe, Ioannis Sechopoulos, Kwang Moo Yi, Jonas Teuwen and Magdalena Bazalova-Carter
Journal of Nuclear Medicine August 2022, 63 (supplement 2) 2401;
Affiliations: 1 University of Victoria (O'Briain); 2 BC Cancer (Uribe); 3 Radboud University Medical Centre (Sechopoulos, Teuwen); 4 University of British Columbia (Yi)

Abstract

2401

Introduction: Because of the long acquisition times required for positron emission tomography (PET) imaging, respiratory motion can substantially degrade the quality of the resulting images. This is especially true for tumors located in regions prone to movement, such as those close to the diaphragm. To correct for this, gated images are typically generated using only the coincidences detected during a particular phase of the breathing cycle (e.g., the quiescent phase). However, because only a small portion of the data collected during the scan is used, a gated image contains fewer detections than a non-gated one and is therefore noisier. Compensating for this requires either a longer scan time or a higher injected activity of the radiopharmaceutical. Neither solution is ideal: longer scan times decrease patient comfort and throughput, while higher activity exposes the patient to additional radiation. An alternative approach in the literature is to register the different phases of the breathing motion to one another, for example using mutual information, so that all of the detections acquired during the scan are utilized. In this project, we explore an unsupervised machine learning approach to perform these registrations.
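
The phase-gating scheme described above can be illustrated as follows. The sinusoidal 4 s breathing period, the uniform event timestamps, and the quiescent-phase window are all hypothetical values chosen for this sketch, not parameters from the study:

```python
import numpy as np

# Hypothetical illustration: keep only the PET coincidence events whose
# timestamps fall in the quiescent phase of an assumed respiratory cycle.
rng = np.random.default_rng(0)
period = 4.0                            # assumed breathing period (s)
t = rng.uniform(0.0, 60.0, 100_000)     # simulated event timestamps (60 s scan)

# Respiratory phase in [0, 1); the quiescent window is assumed to span 0.4-0.6.
phase = (t % period) / period
gated = t[(phase >= 0.4) & (phase < 0.6)]

# Only ~20% of the events survive gating, which is why gated images are noisier.
print(len(gated) / len(t))
```

This makes the trade-off concrete: a 20%-wide gate discards roughly 80% of the acquired counts, which is the motivation for registration-based approaches that use all detections.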

Methods: A training dataset of 270 patients was generated using the 4D extended cardiac-torso (XCAT) anthropomorphic digital phantom with (2 × 4 × 4) mm³ voxels, with varying inhalation levels for different anatomies as well as a variety of other patient parameters. The machine learning network was trained by providing the model with two PET frames from different breathing phases. The network then predicted the pixel-wise 3D shift from one frame to the other. By comparing the shifted frame to the target frame, the model learned to produce more accurate shifts, making the learning process unsupervised. As a result, the trained model can take gated PET images from different breathing phases and group them into a single motion-corrected bin, providing a final image without the blurring effects initially observed. Because the XCAT phantom gives trivial access to the ground-truth motion, validating the network performance on similar data was robust and straightforward. For a variety of breathing amplitudes, improvements in tumor recovery were quantified by computing the intersection over union of the tumors, as well as the enclosed activity, before and after the corrections were applied. Furthermore, the pixel-wise differences between the predicted and ground-truth shifts were calculated.
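
The unsupervised objective described above, predicting a shift field, warping one frame, and comparing the result to the target frame, can be sketched as below. A trained network would use a differentiable interpolator for the warp; the nearest-neighbor warp, function names, shapes, and toy volumes here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def warp_nn(vol, flow):
    """Warp a 3D volume by a per-voxel displacement field (nearest neighbor).

    vol:  (D, H, W) activity image
    flow: (3, D, H, W) predicted shift in voxels along each axis
    A simplified stand-in for the differentiable warping layer a network
    would use during training.
    """
    d, h, w = vol.shape
    zz, yy, xx = np.meshgrid(np.arange(d), np.arange(h), np.arange(w),
                             indexing="ij")
    z = np.clip(np.rint(zz + flow[0]).astype(int), 0, d - 1)
    y = np.clip(np.rint(yy + flow[1]).astype(int), 0, h - 1)
    x = np.clip(np.rint(xx + flow[2]).astype(int), 0, w - 1)
    return vol[z, y, x]

def unsupervised_loss(moving, fixed, flow):
    """Photometric loss: needs only the two frames, never the true motion."""
    return float(np.mean((warp_nn(moving, flow) - fixed) ** 2))

# Toy example: a 'tumor' blob displaced by 2 voxels along z between phases.
vol = np.zeros((16, 16, 16))
vol[4:7, 8:11, 8:11] = 1.0           # moving frame
fixed = np.roll(vol, -2, axis=0)     # target frame (shifted anatomy)

good = np.zeros((3, 16, 16, 16)); good[0] = 2.0   # the true shift
bad = np.zeros((3, 16, 16, 16))                   # identity (no correction)

# The correct flow gives a lower photometric loss than the identity flow,
# which is the signal the network trains on.
print(unsupervised_loss(vol, fixed, good) < unsupervised_loss(vol, fixed, bad))
```

Minimizing this loss over the flow field recovers the inter-phase motion without any labeled displacement data, which is what makes the training unsupervised.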

Results: Comparing the uncorrected and corrected images to the ground-truth distribution, the network was found to provide relative improvements of 73% and 81% for the intersection over union and total activity, respectively, when correcting for a diaphragm shift of 21 mm. Improvements in tumor recovery were consistent across the tested range of breathing amplitudes. For a diaphragm shift of 21 mm, the median absolute residual error in the flow predictions was less than 1.5 mm, smaller than the smallest voxel dimension of the phantom.
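
The two tumor-recovery metrics reported above can be sketched as below. The binary masks and shift magnitudes are toy assumptions for illustration, not the study's data:

```python
import numpy as np

def tumor_iou(mask_a, mask_b):
    """Intersection over union of two binary tumor masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union

def enclosed_activity(image, mask):
    """Total activity enclosed by a tumor mask."""
    return float(image[mask].sum())

# Toy check with hypothetical masks: a motion-corrected tumor should overlap
# the ground truth far better than one displaced by residual breathing motion.
gt = np.zeros((16, 16, 16), bool); gt[6:10, 6:10, 6:10] = True
corrected = np.roll(gt, 1, axis=0)     # small residual shift after correction
uncorrected = np.roll(gt, 4, axis=0)   # large uncorrected motion offset

img = np.zeros((16, 16, 16)); img[gt] = 2.0   # activity concentrated in tumor
print(tumor_iou(corrected, gt), tumor_iou(uncorrected, gt))
print(enclosed_activity(img, corrected) / enclosed_activity(img, gt))
```

Both quantities are computed before and after correction, so the reported percentages are relative improvements of these ratios toward the ground truth.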

Conclusions: Our results suggest that our proposed method has the potential to accurately correct for breathing motion artifacts in PET. Furthermore, this method is unsupervised, avoiding the need for human intervention when transferring the method to the clinical setting.

[Figure]