Research Article | PET/CT: Imaging Function and Structure

Software Approach to Merging Molecular with Anatomic Information

Piotr J. Slomka
Journal of Nuclear Medicine January 2004, 45 (1 suppl) 36S-45S;

Abstract

Software image registration is a powerful and versatile tool that allows the fusion of molecular and anatomic information. Image registration can be applied to compare anatomic information with function, localize organs and lesions, and plan radiation therapy, biopsy, or surgery. Automatic volume-based image registration techniques have been devised for both linear and nonlinear image alignment. Challenges remain in the validation of the accuracy of software registration. Image registration has been applied clinically in neurology and oncology and may be particularly practical in radiotherapy applications. Potential new applications in cardiology could allow the combination of CT angiography with perfusion and viability images obtained by PET, SPECT, or MRI. Software methods allow versatility in the choice of modalities and facilitate retrospective and selective application. Fully automatic registration algorithms are needed for routine clinical applications. Connectivity, compatibility, and cooperation between various clinical departments are essential for the successful application of software-based image fusion in a hospital setting.

Keywords: image registration; image fusion; PET/CT; nonlinear registration; mutual information

Fusion of images containing molecular and anatomic information could aid clinicians in a variety of clinical applications, including comparison of anatomic information with function, localization and boundary definition of organs and lesions, planning of radiation therapy and biopsy, and integration of PET or other functional modalities with image-guided surgery.

Merging of multimodality images requires accurate image alignment, which is typically referred to as image registration. Such image registration can be accomplished by software algorithms that retrospectively align 3-dimensional (3D) data acquired by stand-alone modalities to common spatial coordinates. Practical systems using software registration algorithms for image fusion have been commercialized and are used clinically in several centers. In this article, issues related to the practical implementation of software for merging anatomic and functional information are discussed, including: (a) description of computer algorithms for automatic retrospective image registration; (b) validation of accuracy for such algorithms; (c) visualization techniques for display of fused images; (d) clinical applications; and (e) comparison with hardware PET/CT technology.

IMAGE REGISTRATION ALGORITHMS

Several approaches have been proposed for the retrospective automatic registration of multimodal images. Broadly, these registration algorithms can be grouped into feature-based methods (using extracted image features) and volume-based methods (using statistical voxel dependencies). The algorithms can also be classified as linear, when the computed alignment transformation between 3D image volumes is limited to translation, rotation, and possibly scaling, or nonlinear, when more complex transformations are allowed. Nonlinear techniques can be feature based or volume based.

Feature-Based Algorithms

Feature-based registration algorithms seek to align corresponding anatomic landmarks, organ surfaces, or other features. Such techniques consist of 2 steps: (a) extraction of relevant features (points, contours, surfaces) from the images; and (b) spatial alignment of these features. Two representative examples of the feature-based approach are the “head and hat” method (1) and the “iterative closest point” method (2). Accurate image segmentation is required because the registration relies on only the extracted features. Therefore, errors in image segmentation will inevitably lead to errors in image registration. Although the registration of accurately segmented surfaces is computationally straightforward, the identification of such surfaces may be difficult and prone to errors. This step may require significant user interaction, even in the case of brain registration. Automatic extraction of features needs to be customized for each imaging modality and for each organ of interest. Some registration techniques rely on external fiducial markers that can be identified and matched (automatically or manually) on the acquired images (3,4). The main limitations of such marker-based approaches are the increased complexity of the imaging procedures and the lack of information about internal organ displacements.
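
As an illustration of the second step, the sketch below implements a basic iterative-closest-point alignment in Python; the point arrays, iteration count, and helper name are hypothetical, feature extraction itself is not shown, and this is a generic textbook formulation rather than the implementation used in the cited work.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=50):
    """Iterative-closest-point rigid alignment of two point sets.

    `source` and `target` are (N, 3) and (M, 3) arrays of surface points
    (e.g., points sampled from segmented organ surfaces); returns a rotation
    matrix and translation vector mapping source onto target.
    """
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    moved = source.copy()
    for _ in range(iterations):
        # Step 1: pair every source point with its closest target point
        _, idx = tree.query(moved)
        matched = target[idx]
        # Step 2: best rigid transform for these pairs (Procrustes fit via SVD)
        mu_s, mu_t = moved.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((moved - mu_s).T @ (matched - mu_t))
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:  # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        moved = moved @ R_step.T + t_step
        # Accumulate the overall transform
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```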

Volume-Based Algorithms

More recently, volume-based image registration techniques have been introduced to maximize measures of similarity (cost function) between images. The proposed measures of the alignment quality include the standard deviation of the histogram (5), joint entropy (6), and mutual information (MI) (7,8). In particular, methods implementing MI measures (Fig. 1) have been proven versatile and successful in clinical applications (9). Volume-based techniques usually do not depend on image segmentation but exploit the statistical voxel dependencies of the raw image pairs to find the appropriate alignment. These techniques were initially designed for the registration of MRI, CT, and PET images of the brain (5,7,8) but recently have been extended to other organs (10–12). Volume-based techniques have been shown to achieve better accuracy than surface-based methods (13,14).

FIGURE 1.

Concept of image registration based on mutual information (MI) is explained using example of PET and CT. Separate PET and CT image intensity histograms are derived from PET and CT, which contain frequencies (f) of occurrence for specific voxel values in 3D volumes (p = PET, c = CT). Additional 2D image histogram is created from combination of PET and CT data, in which frequencies of occurrence for particular PET/CT voxel intensity pairs (p, c), both at same location, are calculated. Subsequently, PET and CT image entropies are calculated from PET and CT histograms, and 2D PET/CT histogram is used to calculate joint entropy. Joint entropy is smallest and, consequently, MI is largest when images are closely aligned and 2D histogram is least dispersed. Search is performed, which continuously modifies 3D shifts (X,Y,Z) and rotations (XY, XZ, YZ), each time transforming PET data. Although it is possible to perform image registration using joint entropy only, inclusion of separate PET and CT entropies is needed when portions of PET volume could move outside of overlapping field of view.
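
As a rough illustration of the measure described in Figure 1, the following Python sketch estimates mutual information from a joint intensity histogram; the array names and bin count are arbitrary choices made for this example, not part of any published implementation.

```python
import numpy as np

def mutual_information(pet, ct, bins=64):
    """Estimate mutual information between two coregistered 3D volumes.

    `pet` and `ct` are numpy arrays of identical shape (assumed resampled
    to a common grid); the bin count is an arbitrary choice for this sketch.
    """
    # 2D joint histogram of voxel-intensity pairs at the same locations
    joint, _, _ = np.histogram2d(pet.ravel(), ct.ravel(), bins=bins)
    p_joint = joint / joint.sum()

    # Marginal distributions correspond to the separate PET and CT histograms
    p_pet = p_joint.sum(axis=1)
    p_ct = p_joint.sum(axis=0)

    def entropy(p):
        p = p[p > 0]          # treat 0 log 0 as 0
        return -np.sum(p * np.log(p))

    # MI = H(PET) + H(CT) - H(PET, CT); largest when the joint histogram
    # is least dispersed, i.e., when the images are closely aligned
    return entropy(p_pet) + entropy(p_ct) - entropy(p_joint.ravel())
```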

Several possible modifications to volume-based algorithms can improve their reliability and speed. For example, PET transmission maps acquired with emission data can be used in the calculation of the cost function, because these maps do not reflect the physiologic variations of the radioisotope uptake. The emission images, however, contain important information that can correlate with anatomic features on CT or MRI. Therefore, the registration of the combined emission and transmission data with CT is more reliable than the registration of emission or transmission data alone (11). It is also possible to adjust the image resolution “on the fly” during the iterative search for the best alignment. Initially, the stand-alone images can be grossly misaligned, and small matrix sizes are sufficient to search for approximate alignment. The matrix size can be progressively increased as the images become closely aligned, thus allowing fine adjustments. Such multiresolution techniques can decrease the time of the computations and may avoid entrapment in “local minima” (15). The method used to search for the optimal value of the cost function also affects the calculation time and the likelihood of entrapment in local minima. Furthermore, the number of registration parameters will determine the time required to find a solution. For example, the search for a transformation that includes additional scaling parameters will take longer than rigid-body registration. Current practical implementations allow fully automated image registration of thoracic CT and PET in <1 min (10,11) with an accuracy of <1 cm.
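
A minimal sketch of such a coarse-to-fine rigid search is given below, reusing the mutual_information helper from the previous sketch; the subsampling factors, interpolation order, and choice of Powell's derivative-free optimizer are assumptions for illustration, and both volumes are assumed to have been resampled to a common grid.

```python
import numpy as np
from scipy import ndimage, optimize

def rigid_transform(vol, params):
    """Apply 3 rotations (degrees) and 3 translations (voxels) to a volume."""
    rx, ry, rz, tx, ty, tz = params
    out = ndimage.rotate(vol, rx, axes=(1, 2), reshape=False, order=1)
    out = ndimage.rotate(out, ry, axes=(0, 2), reshape=False, order=1)
    out = ndimage.rotate(out, rz, axes=(0, 1), reshape=False, order=1)
    return ndimage.shift(out, (tx, ty, tz), order=1)

def register_rigid(pet, ct, factors=(4, 2, 1)):
    """Coarse-to-fine search of the 6 rigid-body parameters that maximize MI.

    `pet` and `ct` are assumed to share the same voxel grid; `factors` are
    illustrative subsampling steps for the multiresolution pyramid.
    """
    params = np.zeros(6)
    prev_f = factors[0]
    for f in factors:
        # translations found on the coarser grid are rescaled to the new grid
        params[3:] *= prev_f / f
        pet_s = ndimage.zoom(pet, 1.0 / f, order=1)
        ct_s = ndimage.zoom(ct, 1.0 / f, order=1)
        cost = lambda p: -mutual_information(rigid_transform(pet_s, p), ct_s)
        # Powell's method is one derivative-free option; the choice is arbitrary here
        params = optimize.minimize(cost, params, method="Powell").x
        prev_f = f
    return params
```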

Nonlinear Registration

Image registration of thoracic and abdominal scans may require nonlinear transformation to compensate for changes in body configuration, breathing patterns, or movements of internal organs. Nonlinear image alignment (image warping) uses advanced interpolation schemes, such as the thin-plate spline method (16) or piecewise-linear methods (12) adapted to 3 dimensions. A major difficulty with nonlinear warping is the determination of the correct nonlinear transform from the functional/molecular images. The transmission images are acquired during the same tidal breathing as the emission images and, therefore, may provide an approximate reference for the nonlinear image transformation that is not affected by potential mismatches between the radiopharmaceutical uptake and the anatomy. Our group has recently proposed a fully automated warping algorithm for thoracic PET and CT, which compensates for nonlinear deformations between PET and inspiration CT scans (11). Nonlinear registration techniques must be automated, because the number of adjustable image parameters can be large (in the hundreds). An example of image warping is shown in Figure 2. This nonlinear registration has been performed by automatic identification of PET and CT landmarks in the lungs, diaphragm, and body contours, followed by accelerated warping. The total calculation time for such nonlinear registration is <5 s on a standard personal computer.
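
The following sketch shows one way a landmark-driven thin-plate-spline warp could be applied, assuming corresponding landmark coordinates are already available and that both volumes share a common voxel grid; it is a generic illustration, not the published algorithm (11), and how the landmarks are identified is outside its scope.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates

def warp_to_ct(pet, pet_landmarks, ct_landmarks):
    """Thin-plate-spline warp of a PET volume onto CT landmark positions.

    `pet_landmarks` and `ct_landmarks` are (N, 3) arrays of corresponding
    voxel coordinates (e.g., points on the lungs, diaphragm, body contour).
    """
    # Fit a smooth displacement field from the sparse landmark correspondences
    displacement = RBFInterpolator(ct_landmarks,
                                   pet_landmarks - ct_landmarks,
                                   kernel="thin_plate_spline")

    # Evaluate the field at every voxel of the output (CT) grid and
    # pull the corresponding intensities from the PET volume
    grid = np.indices(pet.shape).reshape(3, -1).T.astype(float)
    sample_points = grid + displacement(grid)
    warped = map_coordinates(pet, sample_points.T, order=1)
    return warped.reshape(pet.shape)
```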

FIGURE 2.

Example of image warping. (A) Fusion of diagnostic CT acquired in inspiration with stand-alone PET using linear registration. Note gross misalignment of tumor and diaphragm. (B) Same CT images fused with emission PET images after automatic warping correction (11). Patient data acquired at Department of Nuclear Medicine/PET Center, Zentralklinik Bad Berka, Germany.

VALIDATION

Validation of retrospective image registration is difficult and is particularly challenging for nonlinear algorithms. The external fiducial markers placed on the patient’s skin could be used as a “gold standard,” but they may shift between 2 examinations. The accuracy of registration using external markers has been reported to be lower than that of volume-based registration in brain studies (13). A more definitive validation can be performed with stereotactic frames, a process that is highly invasive for the patient and only applicable in brain studies (17). Phantom studies are of limited value in the validation of registration because of the impact of variable patient anatomy on the performance of the algorithms (18). Moreover, no available phantoms simulate complex patient motion. The accuracy of the volume-based PET-to-CT image registration for the brain has been reported to be in the order of 2 mm or half the voxel size (17). Soft-tissue image registration can be validated by expert observers who can identify the locations of corresponding anatomic features. One intriguing possibility is to validate the stand-alone PET/CT image registration by misaligning and then re-registering the “gold-standard” data acquired on the PET/CT scanner. However, misregistration may also occur on the PET/CT scanner as a result of patient motion (19), and some software correction for such misregistration of the hybrid PET/CT data may be necessary. The reproducibility and consistency of the algorithms can be evaluated by repeating the registration procedure from multiple starting positions or by comparing the results to results obtained from other data acquired simultaneously, such as transmission maps and contrast images.
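
A perturb-and-recover harness along these lines might look like the sketch below, which reuses the rigid_transform and register_rigid helpers from the earlier sketches; the perturbation ranges are arbitrary, and summing parameters is only a rough check because rigid transforms do not compose additively.

```python
import numpy as np

def consistency_check(pet, ct, n_trials=10, max_shift=20.0, max_angle=10.0, seed=None):
    """Evaluate registration reproducibility by perturb-and-recover trials.

    A hypothetical harness, not a validation protocol from the article:
    aligned "gold standard" volumes (e.g., from a hybrid scanner) are
    misaligned by known random rigid transforms, re-registered, and the
    residual parameter errors are reported.
    """
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(n_trials):
        true = np.concatenate([rng.uniform(-max_angle, max_angle, 3),
                               rng.uniform(-max_shift, max_shift, 3)])
        misaligned = rigid_transform(pet, true)      # from the earlier sketch
        recovered = register_rigid(misaligned, ct)   # from the earlier sketch
        # Ideally the recovered transform cancels the applied one; exact
        # cancellation would require composing the transforms, so the sum
        # of parameters is only an approximate per-parameter error measure
        errors.append(np.abs(recovered + true))
    errors = np.array(errors)
    return errors.mean(axis=0), errors.max(axis=0)
```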

VISUALIZATION TECHNIQUES

Fusion Display

Anatomic images such as CT or MRI are displayed using gray-scale lookup tables, whereas functional information derived from PET and SPECT can be depicted with high contrast using color. With a 24-bit display, which is standard on most workstations, it is possible to perform color blending (or α blending). The corresponding intensities from both images are combined, and new color values are computed for each pixel, simulating the transparency effect. This calculation is often implemented in the graphics hardware of new video cards. One disadvantage of such fused displays is that bright structures on gray-scale images can artificially increase the perceived intensity on color images. Therefore, fused displays should be reviewed with caution, and a side-by-side review of the original image is always recommended. When using 8-bit displays, it is not possible to mix or combine color and gray-scale images for each pixel. Instead, pixels from both modalities can be interleaved. Such interleaved pixels appear as a superimposition of color with gray scale, with the degree of transparency proportional to the interleave step. Image information also can be lost in interleaved displays, because 1 or both sets may need to be subsampled.
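
A minimal sketch of such per-pixel color blending, assuming matplotlib colormaps and a fixed blending weight, is given below; the colormap choices are arbitrary.

```python
import numpy as np
from matplotlib import cm

def fuse_slices(ct_slice, pet_slice, alpha=0.5):
    """Alpha-blend a gray-scale CT slice with a color-mapped PET slice."""
    def norm(img):
        # Normalize to [0, 1] for display
        img = img.astype(float)
        return (img - img.min()) / (np.ptp(img) + 1e-9)

    ct_rgb = cm.gray(norm(ct_slice))[..., :3]   # gray-scale lookup table
    pet_rgb = cm.hot(norm(pet_slice))[..., :3]  # high-contrast color table

    # New color value per pixel: weighted combination simulating transparency
    return (1 - alpha) * ct_rgb + alpha * pet_rgb
```

The returned RGB array can be displayed directly (e.g., with imshow); in an interactive viewer the blending weight would typically be adjustable, and the PET intensity itself could be used to modulate the transparency.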

Alternative Display Options

An effective multimodality display tool is a pair of synchronized cursors displayed simultaneously on both images. The advantages of such synchronized side-by-side displays are that no information is lost and features present at the same location on both images do not interfere with each other. Another technique is to interactively reveal and move a small subwindow that displays the data from the opposite modality. Fused data can be visualized in 3 dimensions using a combination of all the above techniques. Three-dimensional volume-rendered images can be created from the anatomic images and superimposed with 2D orthogonal or oblique cut-planes to reveal the corresponding functional information. Three-dimensional displays also can include extracted surfaces with color-coded functional information. These methods have been applied in multimodal cardiac (20) and brain visualization (21). The multimodality displays can exploit the latest computer graphics hardware innovations, such as hardware-based 3D or 2D texture mapping, which can accelerate 3D navigation, image blending, and even image registration (22,23).

CLINICAL APPLICATIONS FOR SOFTWARE REGISTRATION

Although the initial registration methods were proposed for brain applications, several new techniques have been developed for automated, retrospective registration of other organs, most notably for the registration of thoracic data. These new techniques allow linear or nonlinear alignment. Software registration has been applied in neurologic, oncologic, and cardiac applications for MRI, CT, PET, and SPECT, which demonstrates its versatility. The validation of such registration techniques for clinical applications remains particularly challenging, because each image type and organ requires separate evaluation of the accuracy. To date, the accuracy of registration has been extensively validated only for the brain (17). Validation usually needs to be performed in a subjective fashion, and the results of different studies may not be easily comparable.

Neurology

Multimodality image registration, including PET or SPECT, has been used in several neurologic applications. One clinical example is the combination of MR and ictal SPECT in the imaging of epilepsy. Ictal SPECT images can be coregistered with MR to aid image-guided resection (24). Although SPECT discerns seizure foci, MRI information is needed for navigation and anatomic correlation during the surgical procedure. Another useful application for the combination of SPECT and MRI is the possibility of anatomically derived regions of interest. Image quantification can be improved in neuroreceptor studies when using correct anatomic borders of the basal ganglia (25) (Fig. 3). Image registration of brain PET with MRI has been studied extensively (5,17). Coregistered PET/MRI brain images could be used for partial volume correction of PET and for enhancing quantitative accuracy for measurements of cerebral blood flow, glucose metabolism, and neuroreceptor binding (26). PET-to-functional-MRI registration can have clinical applications in neurosurgery (27). MR images also can be helpful in anatomic localization of activation foci depicted by SPECT (28).

FIGURE 3.

Fusion of 123I-β-carbomethoxyiodophenyl tropane SPECT neuroreceptor images with MRI. MR images can be used to provide anatomic region of interest for basal ganglia, thus avoiding quantification errors resulting from blurred SPECT boundaries. Such anatomic information also potentially can be used for partial volume correction. Images courtesy of Henryk Barthel, MD, Leipzig University, Germany.

Oncology

The first applications of software registration were reported in brain oncology (1). Software registration has been used often in brain oncology, with various combinations of PET, CT, and MRI (12,29,30). Multimodality software registration of CT and PET of the thorax has been approached as a rigid-body problem using surface-based techniques (31), interactively defined homologous markers (32), and MI maximization with nonlinear adjustments (11,12). Automated registration of 111In-capromab pendetide (Prostascint; Cytogen Corp., Princeton, NJ) SPECT and CT pelvic images has been developed using vascular tree information (33). In Figure 4, the results of volume-based image registration using the MI algorithm for 111In-capromab pendetide SPECT and MRI are shown. In this particular application, MR images are preferred to CT because of the superior soft-tissue contrast (34). In Figure 5, stand-alone PET, CT, and MR images are coregistered with a similar algorithm to evaluate the response to chemotherapy.

FIGURE 4.

Two patients imaged with 111In-capromab pendetide SPECT registered by software with MRI. Patient without involvement of neurovascular bundles (yellow arrows) imaged with MRI (A) and with MRI/SPECT fusion (B). Second patient with uptake in left neurovascular bundle imaged with MRI (C) and with MRI/SPECT fusion (D). Fused images aided planning of brachytherapy in these patients. Images courtesy of Samuel Kipper, MD, Medical Director, Pacific Coast Imaging, Irvine, CA.

FIGURE 5.

Staging and evaluation of chemotherapy response in non-Hodgkin’s lymphoma with software registration. 18F-FDG PET examinations defined 2 lesions: hypermetabolic substernal lesion, indicating continuing active disease in a mass that had shown only a decrease in size on CT, and additional small focal area in right paratracheal location. Lesions are shown on maximum intensity projection view (A, left) and corresponding transaxial slices (A, right). With image coregistration, extent of patient’s disease was defined as involving an osseous lesion not appreciated on CT alone. (B) Subsequent MR evaluation and MRI/PET fusion confirmed osseous lesion seen on PET (arrow). PET with MRI coregistration confirmed osseous involvement and location, findings that dramatically changed therapeutic approach. Images courtesy of John Vansant, MD, Providence Medical Center, Portland, OR.

PET is an excellent tool in radiation therapy planning. Image registration has been applied to augment radiation CT-based treatment planning with PET or SPECT information. Treatment plans that incorporate SPECT or PET information have been developed for cancers of the brain (29), lung (35–39), head and neck (40), and abdomen (41). Significant changes in tumor volume definitions have been reported when PET images were used. PET can identify additional lesions or allow dose escalation by identifying necrotic tissue that does not need to be irradiated (39,42). Integrated PET/CT images allow the interactive definition of gross tumor volumes using synchronized multimodality displays. An example of PET data acquired on a stand-alone PET scanner and coregistered with the simulation CT scan is shown in Figure 6. Such coregistered images may be used for radiation treatment planning. The importance of more precise, functionally based treatment planning techniques will likely increase with further advances in radiation delivery, such as intensity-modulated radiotherapy and image-guided brachytherapy.

FIGURE 6.

Application of software registration in radiotherapy. (A) Stand-alone whole-body 18F-FDG PET and thoracic CT images are registered using mutual information method. Both emission (top left) and transmission maps (bottom left) were used by computer algorithm. Note flat shape of bed used for simulation CT. Similar flat-bed configuration was simulated on PET scanner using styrofoam insert. Patient was positioned with arms up during both scans. Simulation CT scan was acquired during normal breathing. 3D orthogonal slices are shown (right) demonstrating volume of tumor in right lung as smaller on PET than on CT, probably because of necrosis. (B) Subsequently, radiotherapy planning was performed with coregistered PET and CT. Coregistered images were transferred via DICOM protocol to treatment planning workstation, and tumor volume was delineated using both scans. PET data allowed reduction of treatment volume. Patient data acquired at London Regional Cancer Center and Hamilton Health Sciences Center, Ontario, Canada.

Cardiology

Although multimodality fusion is currently not used in clinical cardiac imaging, potential cardiac applications could prove both useful and practical. For example, perfusion defects defined by SPECT or PET could be matched with the location of stenosis obtained by coronary CT angiography (CTA). Because perfusion defects and stenotic lesions are often depicted with poor quality, the spatial correlation of these images could allow reconciliation of subtle or equivocal findings. Such techniques were proposed for the fusion of 2D x-ray angiography with SPECT (20) but are not yet used clinically because of difficulties in registration of the 2D x-ray projection data with perfusion data. Fully tomographic 3D CTA techniques may facilitate practical implementations (Fig. 7). Emerging cardiac MRI techniques, such as perfusion and viability imaging, may require direct comparison with SPECT and PET for patient follow-up and for validation of the new approaches (Fig. 8). Preliminary studies are also exploring the possibility of depicting vulnerable plaques in the coronary arteries, aorta, and carotids with 18F-FDG and correlating it with vessel anatomy obtained by CT (43,44) or correlating that information with MR or ultrasound images (45). An added complication in the alignment of cardiac images is the need to match the multiple cardiac phases by using both temporal and spatial information (46).

FIGURE 7.

Future applications for molecular-anatomic cardiac image fusion. Volume-rendered phantom study of biplane 3D vessel reconstruction of x-ray angiography is fused with 18F-FDG PET phantom scan containing simulated defect. Such displays also could be used with 3D CT angiography and SPECT or PET data to compare perfusion and viability defects with extent of stenosis. Phantom data acquired at Ottawa Heart Institute, Ontario, Canada.

FIGURE 8.

Coregistration of delayed contrast enhancement cardiac MR viability images with 18F-FDG viability images. (A) MRI, (B) PET/MRI image fusion, and (C) PET demonstrate concordance, with reduced metabolic activity in regions of near-transmural myocardial scarring in septum and lateral wall (arrows), as seen in left ventricular short-axis imaging plane. Patient data acquired at Cedars-Sinai Medical Center, Los Angeles, CA.

FUTURE

Constant advances in diagnostic imaging and radiotherapy make it difficult to predict which clinical applications will benefit most from image fusion. We know that merging anatomic and molecular information is useful clinically, as recently demonstrated by hybrid PET/CT (47–49). The success of hybrid PET/CT would not be possible without ease of clinical operation and logistic simplicity. Therefore, retrospective software-based methods must match that simplicity by full automation and transparent access to data from all modalities to become accepted in clinical practice. As some predict, MRI may become a leading modality in cardiology (50) and oncology (51); MRI/PET image registration may therefore become of great importance. Radiotherapy departments may use their own specialized simulation CT-, tomotherapy-, or dedicated MRI-based systems for treatment planning and follow-up. Efforts are underway to use MRI as the primary modality for radiotherapy planning (52). PET/CT images could be coregistered to MRI with high accuracy in a nonlinear fashion, because both modalities contain correlating anatomic details. Perhaps the next generation of PET/CT systems could become more economical and more widely used by incorporating fast, ultra-low-dose, mid-quality CT for attenuation correction and for anatomic correlation. For selected patients, such images could be merged through retrospective registration with images from other modalities, such as multislice CT, fast electron-beam computed tomography for cardiac applications, tomotherapy images for radiation therapy planning, and specialized MRI scans.

Limitations

Perhaps the most difficult application for software registration is multimodality imaging of the head and neck, where significant nonlinear mismatches can occur as the result of different positions of the arms or head. Patient immobilization devices may need to be used to ensure correct image alignment (4). The abdominal and thoracic regions also can be significantly deformed, but many corresponding image features are present that can aid in achieving accurate superimposition. In the abdominal and pelvic regions, the required registration accuracy may be higher than in the thoracic region because of false-positive uptake in the bowels and urinary tract adjacent to possible sites of malignant lesions, such as in the ovaries. The lack of commercially available, easy-to-use, fully automatic software that has been customized and validated by vendors for specific clinical situations currently prevents the widespread acceptance of image registration. The manual alignment of stand-alone data is time consuming and unreliable, because multiple parameters must be adjusted simultaneously. In addition, a trivial reason why software-based fusion has not been widely used in clinical practice is the incompatibility of various Digital Imaging and Communications in Medicine (DICOM) standard implementations and the lack of efficient multimodality Picture Archiving and Communication System (PACS) databases. Connectivity and compatibility between various diagnostic and therapeutic systems, cooperation between various departments, and flexibility in patient scheduling are essential for the routine use of software registration. These difficulties are mostly resolved with hybrid PET/CT scanners. Therefore, the software-based approaches may be difficult to implement clinically because of logistic, economic, and political reasons rather than because of inherent limitations in image registration algorithms.

Software or Hardware?

Hybrid SPECT/CT and PET/CT scanners (53,54) have become widely used in diagnostic oncology applications. Software registration, however, may offer greater flexibility, for example, by allowing PET/MRI image combinations (17,55). MRI is superior to CT for oncologic brain imaging, and recent advances in MRI have also allowed fast thoracic and abdominal imaging with excellent speed and contrast (56). It may be necessary at times to acquire images in different body configurations for specific purposes, even when using a hybrid PET/CT system. For example, the CT may need to be performed in deep inspiration because of the superior image quality and potentially more precise dose delivery in radiation therapy (57,58). Such protocols would require nonlinear software registration of PET, even with the hybrid PET/CT data, or, alternatively, acquisition of gated PET (59) and gated CT (60). In addition, despite the rapid proliferation of hybrid PET/CT systems, there is a large installed base of stand-alone PET scanners for which retrospective image registration will always be required. In some departments, software registration and fusion are already used routinely with stand-alone PET scans (61).

Software registration may prove practical for clinical radiotherapy applications. Without the availability of dedicated PET/CT (including the appropriate staff) in the radiotherapy department, it may be necessary to retrospectively match the stand-alone PET or PET/CT scans to the CT simulation scans. However, the alignment accuracy of simulation CT and PET obtained on a hybrid scanner could presumably be higher than that obtained by software registration. The tumor volume definitions in radiation therapy are derived interactively with considerable interobserver variation (39). There are also errors in patient positioning. The required accuracy of software PET/CT registration for this application is not known and remains to be established. Software registration techniques also can be applied for follow-up assessment and comparison of serial scans during the course of therapy and presumably even for the correction of motion for data acquired on the PET/CT scanner.

CONCLUSION

Software image registration is a powerful and versatile tool that facilitates image fusion of several modalities in a variety of clinical situations. It can be used with stand-alone SPECT, PET, CT, or MRI data or with PET/CT scanners. Although image registration techniques have been applied primarily to brain images, other applications are now becoming practical. Connectivity, compatibility, and cooperation between various departments are essential for the clinical dissemination of the software-based image fusion. Software-based multimodality fusion may have several clinical applications: combination of MRI or CT with PET and SPECT for improved diagnosis for oncology, cardiology, and neurology; radiation therapy planning or assessment of therapy with PET combined with CT or MRI; and correction of nonlinear changes between anatomic and functional scans for both stand-alone modalities and hybrid PET/CT systems.

Footnotes

  • Received Sep. 18, 2003; revision accepted Nov. 7, 2003.

    For correspondence or reprints contact: Piotr J. Slomka, PhD, Department of Imaging, Cedars-Sinai Medical Center, A047 8700 Beverly Blvd., Los Angeles, CA 90048.

    E-mail: Piotr.Slomka@cshs.org

REFERENCES

  1. Pelizzari CA, Chen GT, Spelbring DR, Weichselbaum RR, Chen CT. Accurate three-dimensional registration of CT, PET, and/or MR images of the brain. J Comput Assist Tomogr. 1989;13:20–26.
  2. Besl P, McKay N. A method for registration of 3D shapes. IEEE Trans Pattern Anal Machine Intell. 1992;14:239–256.
  3. Mountz JM, Zhang B, Liu HG, Inampudi C. A reference method for correlation of anatomic and functional brain images: validation and clinical application. Semin Nucl Med. 1994;24:256–271.
  4. Sweeney RA, Bale R, Auberger T, et al. A simple and non-invasive vacuum mouthpiece-based head fixation system for high precision radiotherapy. Strahlenther Onkol. 2001;177:43–47.
  5. Woods RP, Mazziotta JC, Cherry SR. MRI-PET registration with automated algorithm. J Comput Assist Tomogr. 1993;17:536–546.
  6. Hill DL, Hawkes DJ, Gleeson MJ, et al. Accurate frameless registration of MR and CT images of the head: applications in planning surgery and radiation therapy. Radiology. 1994;191:447–454.
  7. Wells WM, Viola P, Atsumi H, Nakajima S, Kikinis R. Multi-modal volume registration by maximization of mutual information. Med Image Anal. 1996;1:35–51.
  8. Maes F, Collignon A, Vandermeulen D, Marchal G, Suetens P. Multimodality image registration by maximization of mutual information. IEEE Trans Med Imaging. 1997;16:187–198.
  9. Pluim JP, Maintz JB, Viergever MA. Mutual-information-based registration of medical images: a survey. IEEE Trans Med Imaging. 2003;22:986–1004.
  10. Skalski J, Wahl RL, Meyer CR. Comparison of mutual information-based warping accuracy for fusing body CT and PET by 2 methods: CT mapped onto PET emission scan versus CT mapped onto PET transmission scan. J Nucl Med. 2002;43:1184–1187.
  11. Slomka PJ, Dey D, Przetak C, Aladl UE, Baum RP. Automated 3-dimensional registration of stand-alone 18F-FDG whole-body PET with CT. J Nucl Med. 2003;44:1156–1167.
  12. Tai YC, Lin KP, Hoh CK, Huang SC, Hoffman EJ. Utilization of 3D elastic transformation in the registration of chest x-ray CT and whole-body PET. IEEE Trans Nucl Sci. 1997;44:1606–1612.
  13. Strother SC, Anderson JR, Xu XL, Liow JS, Bonar DC, Rottenberg DA. Quantitative comparisons of image registration techniques based on high-resolution MRI of the brain. J Comput Assist Tomogr. 1994;18:954–962.
  14. West J, Fitzpatrick JM, Wang MY, et al. Retrospective intermodality registration techniques for images of the head: surface-based versus volume-based. IEEE Trans Med Imaging. 1999;18:144–150.
  15. Studholme C, Hill DL, Hawkes DJ. Automated 3-D registration of MR and CT images of the head. Med Image Anal. 1996;1:163–175.
  16. Bookstein FL. Principal warps: thin-plate splines and the decomposition of deformations. IEEE Trans Pattern Anal Machine Intell. 1989;11:567–585.
  17. West J, Fitzpatrick JM, Wang MY, et al. Comparison and evaluation of retrospective intermodality brain image registration techniques. J Comput Assist Tomogr. 1997;21:554–566.
  18. Dey D, Slomka PJ, Hahn LJ, Kloiber R. Automatic three-dimensional multimodality registration using radionuclide transmission CT attenuation maps: a phantom study. J Nucl Med. 1999;40:448–455.
  19. Nakamoto Y, Tatsumi M, Cohade C, Osman M, Marshall LT, Wahl RL. Accuracy of image fusion of normal upper abdominal organs visualized with PET/CT. Eur J Nucl Med Mol Imaging. 2003;30:597–602.
  20. Aguade S, Candell-Riera J, Faber TL, et al. Unified three-dimensional images of myocardial perfusion and coronary angiography. Rev Esp Cardiol. 2002;55:258–265.
  21. Minoshima S, Frey KA, Koeppe RA, Foster NL, Kuhl DE. A diagnostic approach in Alzheimer’s disease using three-dimensional stereotactic surface projections of fluorine-18-FDG PET. J Nucl Med. 1995;36:1238–1248.
  22. Enders F, Strengert M, Iserhardt-Bauer S, Aladl UE, Slomka PJ. Interactive volume rendering of multimodality cardiac data with the use of consumer graphics hardware. In: Hanson K, ed. SPIE 2003 Medical Imaging: Image Visualization. San Diego, CA: SPIE-International Society for Optical Engineering; 2003:119–128.
  23. Hastreiter P, Ertl T. Integrated registration and visualization of medical image data. Proc Comput Graphics Int. 1998;10:78–85.
  24. Murphy M, O’Brien TJ, Morris K, Cook MJ. Multimodality image-guided epilepsy surgery. J Clin Neurosci. 2001;8:534–538.
  25. Barthel H, Muller U, Wachter T, et al. Multimodal SPECT and MRT imaging data analysis for an improvement in the diagnosis of idiopathic Parkinson’s syndrome [in German]. Radiologe. 2000;40:863–869.
  26. Meltzer CC, Kinahan PE, Greer PJ, et al. Comparative evaluation of MR-based partial-volume correction schemes for PET. J Nucl Med. 1999;40:2053–2065.
  27. Fried I, Nenov VI, Ojemann SG, Woods RP. Functional MR and PET imaging of rolandic and visual cortices for neurosurgical planning. J Neurosurg. 1995;83:854–861.
  28. Zifko UA, Slomka PJ, Young GB, Reid RH, Bolton CF. Brain mapping of median nerve somatosensory evoked potentials with combined 99mTc-ECD single-photon emission tomography and magnetic resonance imaging. Eur J Nucl Med. 1996;23:579–582.
  29. Hamilton RJ, Sweeney PJ, Pelizzari CA, et al. Functional imaging in treatment planning of brain lesions. Int J Radiat Oncol Biol Phys. 1997;37:181–188.
  30. Grosu AL, Lachner R, Wiedenmann N, et al. Validation of a method for automatic image fusion (BrainLAB System) of CT data and 11C-methionine-PET data for stereotactic radiotherapy using a LINAC: first clinical experience. Int J Radiat Oncol Biol Phys. 2003;56:1450–1463.
  31. Cai J, Chu JC, Recine D, et al. CT and PET lung image registration and fusion in radiotherapy treatment planning using the chamfer-matching method. Int J Radiat Oncol Biol Phys. 1999;43:883–891.
  32. Wahl RL, Quint LE, Cieslak RD, Aisen AM, Koeppe RA, Meyer CR. Anatometabolic tumor imaging: fusion of FDG PET with CT or MRI to localize foci of increased activity. J Nucl Med. 1993;34:1190–1197.
  33. Hamilton RJ, Blend MJ, Pelizzari CA, Milliken BD, Vijayakumar S. Using vascular structure for CT-SPECT registration in the pelvis. J Nucl Med. 1999;40:347–351.
  34. Meirovitz A, Troyer S, Evans V, et al. Rectum and prostate separation by MRI vs. CT in external beam and post-implant patients [abstract]. Int J Radiat Oncol Biol Phys. 2003;57:S334.
  35. Nestle U, Walter K, Schmidt S, et al. 18F-deoxyglucose positron emission tomography (FDG-PET) for the planning of radiotherapy in lung cancer: high impact in patients with atelectasis. Int J Radiat Oncol Biol Phys. 1999;44:593–597.
  36. Meyer CR, Boes JL, Kim B, et al. Demonstration of accuracy and clinical versatility of mutual information for automatic multimodality image fusion using affine and thin-plate spline warped geometric deformations. Med Image Anal. 1997;1:195–206.
  37. Caldwell CB, Mah K, Ung YC, et al. Observer variation in contouring gross tumor volume in patients with poorly defined non-small-cell lung tumors on CT: the impact of 18FDG-hybrid PET fusion. Int J Radiat Oncol Biol Phys. 2001;51:923–931.
  38. Mah K, Caldwell CB, Ung YC, et al. The impact of (18)FDG-PET on target and critical organs in CT-based treatment planning of patients with poorly defined non-small-cell lung carcinoma: a prospective study. Int J Radiat Oncol Biol Phys. 2002;52:339–350.
  39. Erdi YE, Rosenzweig K, Erdi AK, et al. Radiotherapy treatment planning for patients with non-small cell lung cancer using positron emission tomography (PET). Radiother Oncol. 2002;62:51–60.
  40. Nishioka T, Shiga T, Shirato H, et al. Image fusion between 18FDG-PET and MRI/CT for radiotherapy planning of oropharyngeal and nasopharyngeal carcinomas. Int J Radiat Oncol Biol Phys. 2002;53:1051–1057.
  41. Mutic S, Malyapa RS, Grigsby PW, et al. PET-guided IMRT for cervical carcinoma with positive para-aortic lymph nodes: a dose-escalation treatment planning study. Int J Radiat Oncol Biol Phys. 2003;55:28–35.
  42. Schmucking M, Baum RP, Griesinger F, et al. Molecular whole-body cancer staging using positron emission tomography: consequences for therapeutic management and metabolic radiation treatment planning. Recent Results Cancer Res. 2003;162:195–202.
  43. Rudd JH, Warburton EA, Fryer TD, et al. Imaging atherosclerotic plaque inflammation with [18F]-fluorodeoxyglucose positron emission tomography. Circulation. 2002;105:2708–2711.
  44. Dunphy M, Freiman A, Larson SM, Strauss HW. Detecting F-18 FDG in the coronary arteries, aorta, carotids and illiac vessels: comparison to vascular calcification [abstract]. J Nucl Med. 2003;44:58P.
  45. Slomka PJ, Mandel J, Downey D, Fenster A. Evaluation of voxel-based registration of 3-D power Doppler ultrasound and 3-D magnetic resonance angiographic images of carotid arteries. Ultrasound Med Biol. 2001;27:945–955.
  46. Aladl UE, Dey D, Slomka PJ. Four-dimensional multimodality image registration applied to gated SPECT and gated MRI. In: Hanson K, ed. Medical Imaging 2003: Image Processing. San Diego, CA: International Society for Optical Engineering; 2003:1166–1175.
  47. Bar-Shalom R, Yefremov N, Guralnik L, et al. Clinical performance of PET/CT in evaluation of cancer: additional value for diagnostic imaging and patient management. J Nucl Med. 2003;44:1200–1209.
  48. Hany TF, Steinert HC, Goerres GW, Buck A, von Schulthess GK. PET diagnostic accuracy: improvement with in-line PET-CT system: initial results. Radiology. 2002;225:575–581.
  49. Cohade C, Wahl RL. Applications of positron emission tomography/computed tomography image fusion in clinical positron emission tomography: clinical use, interpretation methods, diagnostic improvements. Semin Nucl Med. 2003;33:228–237.
  50. Pohost GM, Hung L, Doyle M. Clinical use of cardiovascular magnetic resonance. Circulation. 2003;108:647–653.
  51. Padhani AR. Dynamic contrast-enhanced MRI in clinical oncology: current status and future directions. J Magn Reson Imaging. 2002;16:407–422.
  52. Mah D, Steckner M, Palacio E, Mitra R, Richardson T, Hanks GE. Characteristics and quality assurance of a dedicated open 0.23 T MRI for radiation therapy simulation. Med Phys. 2002;29:2541–2547.
  53. Patton JA, Delbeke D, Sandler MP. Image fusion using an integrated, dual-head coincidence camera with X-ray tube-based attenuation maps. J Nucl Med. 2000;41:1364–1368.
  54. Beyer T, Townsend DW, Brun T, et al. A combined PET/CT scanner for clinical oncology. J Nucl Med. 2000;41:1369–1379.
  55. Nelson SJ, Day MR, Buffone PJ, et al. Alignment of volume MR images and high-resolution [18F]fluorodeoxyglucose PET images for the evaluation of patients with brain tumors. J Comput Assist Tomogr. 1997;21:183–191.
  56. Heidemann RM, Ozsarlak O, Parizel PM, et al. A brief review of parallel magnetic resonance imaging. Eur Radiol. 2003;13:2323–2337.
  57. Mah D, Hanley J, Rosenzweig KE, et al. Technical aspects of the deep inspiration breath-hold technique in the treatment of thoracic cancer. Int J Radiat Oncol Biol Phys. 2000;48:1175–1185.
  58. Hanley J, Debois MM, Mah D, et al. Deep inspiration breath-hold technique for lung tumors: the potential value of target immobilization and reduced lung density in dose escalation. Int J Radiat Oncol Biol Phys. 1999;45:603–611.
  59. Nehmeh SA, Erdi YE, Ling CC, et al. Effect of respiratory gating on quantifying PET images of lung cancer. J Nucl Med. 2002;43:876–881.
  60. Low DA, Nystrom M, Kalinin E, et al. A method for the reconstruction of four-dimensional synchronized CT scans acquired during free breathing. Med Phys. 2003;30:1254–1263.
  61. Przetak C, Baum R, Slomka P. Image fusion raises clinical value of PET. Diagn Imaging Eur. 2001;5:10–15.