Journal of Nuclear Medicine
Research Article | The State of the Art

The Rise of Molecular Image–Guided Robotic Surgery

Fijs W.B. van Leeuwen, Tessa Buckle, Matthias N. van Oosterom and Daphne D.D. Rietbergen
Journal of Nuclear Medicine October 2024, 65 (10) 1505-1511; DOI: https://doi.org/10.2967/jnumed.124.267783
1Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
2Section of Nuclear Medicine, Department of Radiology, Leiden University Medical Center, Leiden, The Netherlands

Visual Abstract


Abstract

Following early acceptance by urologists, the use of surgical robotic platforms is rapidly spreading to other surgical fields. This empowerment of surgical perception via robotic advances occurs in parallel to developments in intraoperative molecular imaging. Convergence of these efforts creates a logical incentive to advance the decades-old image-guided robotics paradigm. This yields new radioguided surgery strategies set to optimally exploit the symbiosis between the growing clinical translation of robotics and molecular imaging. These strategies intend to advance surgical precision by increasing dexterity and optimizing surgical decision-making. In this state-of-the-art review, topic-related developments in chemistry (tracer development) and engineering (medical device development) are discussed, and future scientific robotic growth markets for molecular imaging are presented.

  • robotic surgery
  • digital surgery
  • image-guided surgery
  • autonomous robot
  • molecular imaging

Today’s robotic surgery paradigm is the direct result of a decades-long drive toward minimally invasive treatment strategies. The enhanced dexterity and ergonomics that lie at the robot’s core have motivated an increasing number of surgical disciplines to pursue robotics, resulting in a global growth market for robotic surgery that already encompasses more than 9,000 robotic systems used in at least 12 million surgeries (1). Currently, the standard is set by the telerobotic da Vinci platform (Intuitive Surgical), which incorporates a stereoscopic fluorescence camera. However, an increasing number of alternative robotic platforms are being, or have already been, translated into the clinical setting (2). The surgical use of robots has also evoked scientific interest; the number of publications related to robotic surgery has shown a steep upward trend since 2000 (2000–2004, 704 publications, vs. 2019–2023, 19,018 publications; search term, “robotic surgery” in PubMed).

In parallel to the development of robotic platforms, the surgical field is benefitting from the rise of intraoperative molecular imaging (IMI), a molecular imaging subdiscipline that enhances surgical perception. Most commonly, this is done by exploiting radioguidance and fluorescence guidance. Perception enhancement promises to improve target identification and, with that, surgical accuracy and oncologic outcomes. Increases in surgical precision can also lead to a reduction in the number and severity of surgically induced complications. Opportunely, the intent of robotic surgery to increase surgical dexterity converges with the intent of IMI to augment surgical perception, yielding the subdiscipline of molecular image-guided robotics (IGR). Target illumination in this subdiscipline is enabled using approved radiopharmaceuticals (e.g., for identification of nodal involvement in oncology (3,4)) and fluorescent dyes (e.g., for visualization of physiologic measures in an anastomosis or lymphangiography (5,6)). In addition, several experimental pharmaceuticals that target, for example, tumor tissue are under investigation. An overarching factor herein is that especially complex interventions that target small lesions have proven to be highly reliant on the insights provided by preoperative road maps based on SPECT/CT or PET/CT images (4). If detailed preoperative information is available, the surgical approach can be preplanned, or the likelihood of intraoperative target identification can even be predicted (7). During surgery, the general emphasis seems to lie in the use of the γ-emitting radioisotope 99mTc and the approved near-infrared dye indocyanine green, but other isotopes and fluorescent dyes have also been successfully used (6,8).
Best-of-both-worlds IMI scenarios are offered by approaches that combine the benefits provided by radiopharmaceuticals (quantitative [pharmacokinetic] measures and in-depth detection) with those of fluorescent dyes (video-rate imaging and <1 cm superficial detection (8,9)). For example, the initial successes with hybrid sentinel node approaches in prostate cancer patients (10) have instigated the dissemination of this technique in other robotic-surgery indications such as esophageal (11) and bladder (12) cancer.

In a highly interactive intervention such as surgery, decision-making is based on the surgeon–patient interaction. Here, environmental perception is the root cause behind the surgical actions and is defined by the surgeons’ sensory responses in relation to the patient. This concept can be illustrated using an example provided by Bohg et al. (13) showing that shape recognition based on static imaging provides 49% accuracy in object recognition whereas rotation of 3-dimensional vision enriches perception with a corresponding increase to 72%. Furthermore, addition of tactile information (sensing) was shown to result in an eventual 99% accuracy. These insights in interactive perception can be effectively translated to IGR. To this end, static images allow identification of pharmaceutical- and radiopharmaceutical-avid surgical targets and their intensity of uptake, but to enhance the level of perception the static images require complementary imaging–sensing technologies—for example, in the form of counts/s—that support interactive tissue interpretation. Digitization of multisensory data, combined with artificial intelligence, subsequently offers advantages that pave the way toward a future in which perception-enhanced performance helps realize robotic autonomy, such as via autonomous implementation of image guidance to a level that supersedes that of surgeons.
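The stepwise accuracy gain that multisensory perception provides can be mimicked with a toy simulation (our own didactic illustration, not data or code from the cited study by Bohg et al.): three independent, equally noisy binary "senses" are fused by majority vote, and the fused readout classifies more reliably than any single sense does on its own.

```python
import random

random.seed(42)

def sense(truth, accuracy):
    """One noisy binary observation of the ground truth."""
    return truth if random.random() < accuracy else 1 - truth

def accuracy_of(classify, trials=100_000):
    """Empirical accuracy of a binary classifier over random ground truths."""
    hits = 0
    for _ in range(trials):
        truth = random.randint(0, 1)
        hits += classify(truth) == truth
    return hits / trials

# A single 80%-accurate sense vs. a majority vote over three such senses.
single = accuracy_of(lambda truth: sense(truth, 0.8))
fused = accuracy_of(
    lambda truth: round((sense(truth, 0.8) + sense(truth, 0.8) + sense(truth, 0.8)) / 3)
)
print(f"single sense: {single:.3f}, fused senses: {fused:.3f}")
```

Analytically, three independent 80%-accurate observers fused by majority vote reach 0.8³ + 3·0.8²·0.2 ≈ 0.90, illustrating why adding sensing channels to vision enriches perception.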

In this state-of-the-art review, key perception-enhancing components in molecular IGR are addressed because these provide a means to advance the field of minimally invasive robotic surgery via an imaging input (Fig. 1). Key components such as pharmaceuticals and radiopharmaceuticals for target definition, perception enhancement, digitization of data streams, technology assessment, and automation are discussed, and their place in molecular image–guided surgery is emphasized. Translational value is pursued by examining clinical boundary conditions and by defining how these reflect on technologic design choices.

NOTEWORTHY

  • Advances made in intraoperative molecular imaging need to converge with the parallel technologic thrust in surgery that is driving the implementation of robotics across a broad number of clinical indications.

  • Improving the precision of surgical interventions means surgeons must be provided with tools (pharmaceuticals and medical devices) that help increase their perception of the surgical environment.

  • In an analogy to how touch enhances human vision, the surgical value of molecular imaging can be complemented with real-time sensory feedback coming from the robotic manipulators.

  • To quantify the performance increases that are being realized through surgical guidance, the field may have to look past traditional long-term outcome measures and complement these efforts with kinematic assessments that register improvements in the surgical actions.

  • Molecular imaging is a crucial component in delivering on the future promise of autonomous surgical robots that are capable of complex decision-making in a dynamic surgical environment.

FIGURE 1. Systems engineering components in field of image-guided robotics. AI = artificial intelligence.

PHARMACEUTICALS AND RADIOPHARMACEUTICALS FOR TARGET DEFINITION

Perception starts with the ability to separate a target tissue from its surroundings. In IMI, this separation is enabled via the use of pharmaceuticals and radiopharmaceuticals that specifically highlight anatomic or disease-specific features, also called tracers. The development of such pharmaceuticals for surgical purposes finds its origin in radioimmunoguided surgery, a concept that was introduced in the late 1990s and that covers both receptor-targeted and physiology-based approaches (14). The surgical employment of physiology-based approaches has particularly benefitted from the intrinsic pharmacokinetic properties of clinically approved fluorescent dyes such as indocyanine green and fluorescein (6). Today, the use of fluorescence is actively being expanded toward receptor-targeted applications as well. This generally results in the use of fluorescent analogs of receptor-targeted radiopharmaceuticals that are used in diagnostic nuclear medicine. Although many of these pharmaceuticals show promising results in the preclinical setting, only a handful managed to evolve further. After translation, even fewer break through the commercialization boundary and achieve widescale adoption. Nevertheless, emergence of new imaging pharmaceuticals supports widening of the surgical targets and dissemination of IGR to the indications that currently make up the robotic surgery market (urology, 23%; gynecology, 23%; general surgery, 19%; cardiothoracic surgery, 9%; and other indications, 26% (15); Fig. 2).

FIGURE 2. Targets for IMI in relation to main surgical fields that use robotics. Percentages were obtained from Kuokkanen (15). CAIX = carbonic anhydrase IX; CEA = carcinoembryonic antigen; CXCR4 = chemokine receptor-4; EGFR = epidermal growth factor receptor; FAP = fibroblast antigen protein; GRP = gastrin-releasing peptide; PpIX = protoporphyrin IX; SSTR = somatostatin receptor; VEGF = vascular endothelial growth factor.

The ability to sensitively detect radiopharmaceuticals when applied within a microdosing regimen (≤100 μg/patient (16)) greatly facilitates the translational aspects of radioguided surgery. Studies in nuclear medicine indicate that dosing influences the quality of imaging data, whereby an increase in dosing tends to negatively affect obtained results (17). A word of caution here is that a combination of small lesions, low receptor expression levels, and suboptimal tracer affinities could still result in false-negative outcomes (18). Fluorescent tracer derivatives tend to be more translationally impaired because their inferior detection sensitivity often is compensated for by application of therapeutic dosing regimens (mg/kg) (19). Recent dosing studies with the fluorescent prostate-specific membrane antigen (PSMA)–targeting tracers IS-002 (20) and OTL78 (21) indicate that the use of high doses tends to result in receptor oversaturation, which not only negatively affects signal-to-background ratios but also increases overtreatment (false-positive results).
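The oversaturation effect described above can be illustrated with a simple Langmuir-type binding sketch (a didactic model with arbitrary parameter values, not data from the cited IS-002 or OTL78 dosing studies): specific receptor binding saturates with increasing dose while nonspecific background uptake keeps rising, so the tumor-to-background ratio deteriorates at high doses.

```python
def tumor_to_background(dose_nM, kd_nM=10.0, bmax=1.0, nonspecific=0.01):
    """Illustrative Langmuir-type model: saturable specific binding plus
    linearly increasing nonspecific background. All parameters are
    arbitrary example values chosen only to show the trend."""
    specific = bmax * dose_nM / (kd_nM + dose_nM)  # saturable receptor binding
    background = nonspecific * dose_nM             # nonsaturable uptake
    return specific, background, specific / background

for dose in (1.0, 10.0, 100.0, 1000.0):
    s, b, r = tumor_to_background(dose)
    print(f"dose {dose:7.1f} nM -> signal {s:.2f}, background {b:.2f}, ratio {r:.1f}")
```

In this toy model the ratio falls monotonically with dose, mirroring why microdosed radiotracers translate more readily than therapeutically dosed fluorescent tracers.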

It can be considered beneficial for surgical perception when diagnostic 3-dimensional images, provided by nuclear medicine, can be substantiated with dynamic intraoperative tracing or imaging findings. The correlation between pre- and intraoperative findings is, among others, supported by the availability of theranostic pairs of pharmaceuticals and radiopharmaceuticals that can be used at the same, or similar, dosing. A textbook example is the use of 68Ga-/18F-PSMA PET for diagnostics and 99mTc-/111In-PSMA for surgical radiotracing (3). For this tracer pair, it was recently shown that the SUVmax on 68Ga-/18F-PSMA PET directly relates to the 99mTc-PSMA signal intensities encountered during surgery (7). Uniquely, the common radiopharmaceutical optimization of the molar activity and specific activity allows accurate lesion identification without oversaturation (18). Important to realize is that intraoperative imaging technologies are not expected to reliably detect lesions not visible in, for example, preoperative PET/CT road maps (22). Following this rationale, theranostic pairs that use somatostatin-targeted neuroendocrine tumor targeting could be used to facilitate the resection of gastroenteropancreatic neuroendocrine tumors with, for example, 68Ga-DOTATOC or 99mTc-EDDA/HYNIC-octreotate (23). Looking ahead, various other diagnostic PET radiopharmaceuticals have 99mTc-containing analogs, indicating that more targets could be exploited for surgical guidance. Examples are 99mTc-pentixafor (target: chemokine receptor-4 expressed on, e.g., glioblastoma (24)), 99mTc-fibroblast antigen protein inhibitor 34 (target: fibroblast antigen protein, involved in >28 different cancer types (25)), 99mTc-folate (target: folate receptor (26)), 99mTc-DB15 (target: gastrin-releasing peptide receptor (27)), 99mTc-IMMU-4Fab′ (target: carcinoembryonic antigen (28)), and a variety of targets and radioisotopes previously pursued in radioimmunoguided surgery initiatives (29). 
As sensitivity and dosing seem to play a critical role in achieving a correlated pre- and intraoperative accuracy, it will be complex to create well-correlated theranostic pairs made of, for example, PET tracers (microdosing and depth-independent detection) and purely fluorescent tracers (therapeutic dosing and superficial detection only). This is certainly true for small molecules and may be dosing-dependent for monoclonal antibodies, making it a topic in need of further investigation. Hybrid tracers that contain both a radioactive and a fluorescent component, however, seem to provide a logical design strategy in the pursuit of tracers that can directly relate intraoperative fluorescence findings to findings of noninvasive preoperative imaging (9).

PERCEPTION-ENHANCING MODALITIES

Given the fact that tissue tends to move during surgery, static (preoperative) images provide only limited guidance. Technologies that enrich the robots’ perceptual abilities (e.g., sensing and vision) during the surgical act will help enhance the surgeon–patient interaction. To integrate such modalities in the robotic platform, several technical complexities need to be overcome—for example, accessibility of the target, perception of stiffness during surgical maneuvers (i.e., the fulcrum effect (30)), and limited freedom of movement.

Endoscopic vision has played a critical role in the success of surgical robotics. Video image guidance is facilitated via system-integrated endoscopes that provide 3-dimensional video streams of the surgical field. These endoscopes include traditional white-light imaging and, in some cases, fluorescence imaging (e.g., the Firefly cameras on the da Vinci Si, X, Xi, and SP systems [Intuitive] (31), the TIPCAM Rubina [Karl Storz] video endoscope on the HUGO RAS [Medtronic] system (32), and more recently the Versius Plus vLimeLite [CMR Surgical] system). The integration of fluorescence imaging and the ability to identify moving tissues at video rate has instigated a paradigm shift toward the acceptance of fluorescence guidance in surgery. This success underscores how the integration between the robot and perception-enhancing modalities fundamentally determines the usability and impact of a technology.

Perception is optimal when vision is combined with a sense of touch (e.g., palpation, pressure, temperature sensation, or pain). When the surgeon is placed behind a console at a distance from the patient, intentional interactions with the patient are typically made by the robot’s end-effectors, meaning the manipulating instruments. Unfortunately, the tool–patient interactions of, for example, the initial da Vinci platforms were deprived of sensations. The newest da Vinci platform now incorporates tactile feedback. A logical next step in IGR design is to further enrich manipulating instruments with alternative senses of touch (Fig. 3A), allowing them to fulfill a role as a multisensory surrogate for the surgeon’s hands. An advantage of surgical instruments is that they can be technically enhanced to incorporate sensing capabilities that go far beyond human sensory capabilities—for example, sensors that support the detection of pharmaceuticals and radiopharmaceuticals and in effect enable finger-tip molecular sensing. The first steps toward realizing sensory enrichment have already been made with tethered Drop-In ultrasound (33), γ- (34), β- (35), and fiber-confocal (36) probes; and the pursuit of second-generation click-on sensors that can be applied directly onto the surgical tool (Fig. 3B (37,38)). As expected, these sensory readouts demonstrated perception enhancement in applications that also used fluorescence vision (39). Clinical and preclinical evaluations indicate that sensory enrichment can be extended to a variety of molecular imaging signatures, such as β-particles, Raman spectroscopy, mass spectrometry, fluorescence, and fluorescence lifetime (34,40,41).

FIGURE 3. Enhancing perception by combining vision with sensory information: vision and sensing, translating concepts from human to robot (A); evolution of γ-probes to molecular-sensing capabilities in robotic instruments (B).

In contrast to these efforts to enhance the surgeon’s perception during the intervention, off-line back-table (ex vivo) assessments are also increasingly being proposed for IGR (42). Promising examples include the use of Cerenkov imaging (43), confocal systems (36), small-bore PET systems (44), and freehand tissue scanning (45). Although presenting a confirmatory value regarding target removal (42), it remains difficult to relate these measurements to the intraoperative pose of the target.

DIGITIZATION OF DATA STREAMS

Uniquely, surgical robots provide the hardware and the computing power to support data integration. This makes them ideal as digital operating platforms that provide a means to absorb and merge or multiplex multisensory data streams (46), including data streams that are not related to imaging, such as patient history, anesthesiology, and logistics. These inputs can be converted to outputs that highlight findings and send alerts to the surgical staff. When combined with smart algorithms, pre-, intra-, and postsurgical data streams can be processed to unmatched levels of complexity (Fig. 4A). All these then come together in a “control tower” that helps to redefine how surgery is analyzed and could be performed (data–surgeon interaction (47)).

FIGURE 4. Digital surgery. Multiplexing of surgery-related data streams and their interaction with surgeon (A), mixed-reality visualizations (B), robotic instrument tracking technologies (C), and use of artificial intelligence for automated image analysis in surgical video feeds (D). (Left image in B reprinted from (42), with no changes made; https://creativecommons.org/licenses/by/4.0/). P = performance; wf = weighing factor; Dx = dexterity; DM = decision-making.

The most straightforward example of imaging–data integration during robotic surgery is the split-screen visualization of preoperatively acquired imaging road maps (e.g., CT, lymphoscintigraphic, or SPECT/CT images) directly next to the surgical video feed (34). This strategy helps to actively relate diagnostic imaging information (static images) to the dynamic surgical environment. This can be further enhanced via the employment of augmented- or mixed-reality visualizations whereby the preoperative images are overlayed onto the surgical video feed (Fig. 4B (48)). Currently, this strategy most widely uses radiologic images (CT and MRI). Nevertheless, there are also examples of nuclear medicine–generated images being displayed over a fluorescence-enhanced video feed (49). Active positional tracking of both instruments and patient anatomy support Global Positioning System–like directional guidance (Fig. 4C). Such navigation strategies require a direct relation between the pose of the robot, the pose of the target during surgery, and the pose of the target during preoperative imaging. As this relation can be realized using rigid landmarks, it is routinely applied during, for example, orthopedic surgery, skull surgery, and neurosurgery (50). Unfortunately, implementation of image-to-patient registration in soft-tissue indications is still hampered by challenges related to deformations caused by positioning, insufflation, breathing, and the tissue movement of the surgical manipulation itself. This stresses the need for confirmatory surgical modalities such as fluorescence imaging or γ-tracing that can be used to correct the navigation accuracy in real time (49). Uniquely, the active tracking of the Drop-In γ-probe during surgery has opened the possibility to register its intraabdominal readout with its positional location. 
This feature, when complemented by freehand image-reconstruction algorithms, can enable an interaction-facilitated mixed-reality vision enhancement called robot-assisted SPECT (Fig. 4B, right (51)). A tomographic form of digital perception enhancement could in the future also benefit other robotic sensing modalities.
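The rigid-landmark registration principle underlying such navigation can be sketched as follows (our own simplified illustration with hypothetical landmark coordinates, not any vendor's implementation): given paired preoperative and intraoperative fiducials, a least-squares rigid transform is obtained with the Kabsch/Umeyama method.

```python
import numpy as np

def rigid_register(preop_pts, intraop_pts):
    """Least-squares rigid transform (rotation R, translation t; no scaling)
    mapping preoperative landmarks onto their intraoperative poses.
    Inputs are paired (N, 3) arrays."""
    p_mean, q_mean = preop_pts.mean(0), intraop_pts.mean(0)
    P, Q = preop_pts - p_mean, intraop_pts - q_mean
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Hypothetical example: landmarks rotated 90 degrees about z and shifted.
preop = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
R_true = np.array([[0., -1., 0.],
                   [1.,  0., 0.],
                   [0.,  0., 1.]])
intraop = preop @ R_true.T + np.array([5., 2., 1.])
R, t = rigid_register(preop, intraop)
fre = np.linalg.norm(preop @ R.T + t - intraop, axis=1).mean()
print(f"mean fiducial registration error: {fre:.2e}")
```

In soft tissue the assumption of rigidity breaks down, which is exactly why the confirmatory modalities mentioned above are needed to correct the navigation in real time.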

In surgical practice, it often remains challenging to interpret the collected data. When interpretation goes wrong, it can lead to a false-negative (missed lesions) or false-positive (overtreatment) readout. In the context of data processing, computer vision algorithms such as artificial intelligence strategies can help support high-end feature extraction. Early examples include anatomy recognition, instrument segmentation (Figs. 4C and 4D) (7), and fluorescence intensity interpretation (52).
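A minimal sketch of what automated fluorescence intensity interpretation entails (a hypothetical example on synthetic data; the cited algorithms are considerably more sophisticated) is scoring the signal-to-background ratio of a frame once a lesion mask is available:

```python
import numpy as np

def signal_to_background(frame, lesion_mask):
    """Mean in-lesion intensity divided by mean background intensity.
    frame: 2-D fluorescence image; lesion_mask: boolean array, same shape."""
    signal = frame[lesion_mask].mean()
    background = frame[~lesion_mask].mean()
    return signal / background

# Synthetic 64x64 frame: noisy background around 10 with a bright 8x8 lesion.
rng = np.random.default_rng(0)
frame = rng.normal(10.0, 1.0, size=(64, 64))
frame[28:36, 28:36] += 40.0
mask = np.zeros((64, 64), dtype=bool)
mask[28:36, 28:36] = True
print(f"signal-to-background ratio: {signal_to_background(frame, mask):.1f}")
```

Automating the mask generation itself (lesion segmentation) is where the artificial intelligence strategies mentioned above come into play.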

TECHNOLOGY ASSESSMENT

The societal incentive for the use of robotics is to offer precision enhancement and a decrease in short- and long-term complications. These features are not defined by first-in-human proof-of-concept data but rather by multivariate health technology assessments performed over a prolonged period (53); health technology assessment is a systematic and multidisciplinary evaluation of the properties of health technologies and interventions covering both their direct and indirect consequences, making it a bridge that connects the world of research to that of policy making. Assessment of the patient benefit embodies traditional outcome measures such as randomized retrospective analysis of databases for complications, quality-adjusted life years, and disease-free survival. For example, quality-adjusted life years have been used to clarify for which indications robotic surgery may (e.g., for prostatectomy (54)) or may not (e.g., for cystectomy (55)) be cost-efficient. The ability to provide high-end evidence on benefits for the patient or the treating physicians not only drives technologic dissemination (56) but also defines the ability to make a healthy business case for a technology. Here, it should be noted that something can be technologically superior but fail during translation simply because of financial reasons. Alternatively, technologies with seemingly poor business cases can make it simply because of financial backing and strong public relations efforts. Currently, we are in a situation in which commercial success has become the best measure of technologic value. When there is healthy competition in a market, one may argue that commercial success is driven by cost, which will ultimately benefit the health care systems and patients. But when technologic availability is limited, commercial interests may not always yield the best patient benefit. 
For IGR technologies to offer optimal benefit, it is highly desirable, or perhaps even necessary, to come up with objective means to score value. In this respect, shifting focus to the field of IMI immediately exposes a challenge because long-term technologic assessments are rarely reported. Despite countless new concepts, pharmaceuticals, and prototypes that have been evaluated in first-in-human clinical studies, only a handful of IMI procedures has been validated on the basis of outcome measures. Here again, progress is being limited by the fact that companies or investors are reluctant to support 5–10 y of clinical trials. Examples that were able to overcome this barrier are protoporphyrin IX photodynamic diagnostics, sentinel node procedures, and PSMA radioguided surgery procedures (3,10,57), and only the latter two have been evaluated in the context of IGR.

Unfortunately, traditional long-term patient outcome measures do not match well with the speed at which research and development activities are currently being conducted at innovation labs, start-ups, and companies. At the same time, the pursuit of novelty and intellectual property creates the danger that innovations are not validated according to the highest norms and standards, which can ultimately lead to late clinical failure. When assuming that the goal of IGR is to use perception enhancement to advance the surgeon–patient interaction, one can even claim that traditional patient outcome readouts provide an indirect measure for the technologic impact. This can instigate a search for alternative performance assessment strategies. If we look at the way technologic enhancement is assessed in areas such as sports and motor sports, it becomes clear that movement kinematics, conducted during the act, help provide a wealth of quantitative readouts regarding the performance. Because the surgeons’ skills are defined by dexterity (gesture) and decision-making (perception) (58), extraction of multidimensional kinematic metrics related to instrument movement (e.g., speed, path length, jerkiness, and directionality) provides a means to objectively assess how innovations alter the surgeon–robot interaction (7,59). This in turn can be predictive for the surgeon–patient interaction. Recently, such strategies have been successfully exploited to quantify how cues based on pharmaceutical and radiopharmaceutical signal intensities and signal-to-background ratios impact the surgical decision-making (7,60).
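Such kinematic metrics can be sketched as follows (our own illustration on a synthetic trajectory; validated assessment frameworks use richer metrics): path length, mean speed, and jerkiness are derived from finite differences of a sampled instrument-tip trajectory.

```python
import numpy as np

def kinematic_metrics(positions, dt):
    """Simple kinematic summary of an instrument-tip trajectory.
    positions: (N, 3) array of tip coordinates sampled every dt seconds."""
    vel = np.diff(positions, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    jerk = np.diff(acc, axis=0) / dt
    return {
        "path_length": np.linalg.norm(np.diff(positions, axis=0), axis=1).sum(),
        "mean_speed": np.linalg.norm(vel, axis=1).mean(),
        "jerkiness": np.linalg.norm(jerk, axis=1).mean(),  # lower = smoother
    }

# Hypothetical trajectories: a smooth helical arc vs. the same arc with tremor.
t = np.linspace(0, 2 * np.pi, 200)
dt = t[1] - t[0]
smooth = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
tremor = smooth + 0.01 * np.sin(40 * t)[:, None]
m_smooth = kinematic_metrics(smooth, dt)
m_tremor = kinematic_metrics(tremor, dt)
print("smooth jerkiness:", m_smooth["jerkiness"])
print("tremor jerkiness:", m_tremor["jerkiness"])
```

The tremor-laden trajectory produces a markedly higher jerkiness score, which is the kind of objective contrast such assessments exploit when comparing guided and unguided surgical actions.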

AUTOMATION

In industry, the term robotics goes together with the term automation. Nevertheless, today’s teleoperative surgical systems are classified as having no to low (tremor filtering) autonomy (level of autonomy, 0–1), meaning that the motions along the robotic links and joints remain fully controlled by the operating surgeon. The surgeon is also in charge of procedural planning and adaptation to changes in the environment that occur during the intervention. This is so even while the growing shortage in skilled surgical personnel, the ever-increasing procedural complexity, and rising health care costs provide a powerful incentive to move decision-making away from human supervisors (61). Endowing robots with full autonomy (level of autonomy, 5) thereby promises to democratize surgery, help make surgical quality ubiquitous, standardize outcomes, and reduce recurrences.

Beyond health care, perhaps the best-known example of using supervised autonomy in a dynamic environment is adaptive cruise control in a car (level of autonomy, 1). A clinical situation in which specific surgical subtasks are outsourced to the robot is the use of the Robodoc (Integrated Surgical Systems) (62) or AquaBeam (PROCEPT BioRobotics Corp.) (61) systems. For cars to advance to a higher level of autonomy, they require an exceptional level of sensory enrichment coupled with artificial intelligence–advanced data computing (Fig. 5 (63)). Subsequently, active interaction between the components of data-acquisition, processing, and automated perception assessments (i.e., decision-making) allow vehicles to cope with environmental variations. Translation of these concepts to a surgical robot demands a more intelligent interaction between the robot and the surgical environment (Fig. 5 (64)), something that can be facilitated by pharmaceuticals and molecular imaging or sensing technologies. Considering how easy it is for surgeons to overlook tumor fragments during surgery, control strategies that raise the diagnostic accuracy provide an obvious starting point when exploring surgical automation (65,66). Over time, such efforts will help, among others, to transfer the above-mentioned freehand IGR technologies into hands-free technologies that empower surgeons in their perception.

FIGURE 5. Sensory experiences in self-driving cars vs. matching available, but often experimental, robotic surgery technologies. AI = artificial intelligence. *Technology under exploration in research setting.

The rise of autonomous vehicles poses obvious dilemmas with regard to liability and ethics (67). These topics are being critically examined by today’s lawmakers, starting with regulations concerning the use of artificial intelligence. As the act of surgery is fault-intolerant, emphasis should be put on addressing these dilemmas before robots are entrusted to reliably identify, and quickly react to, unpredictable clinical situations (61).

CONCLUSION

The rise of IGR offers the field of IMI unique (out-of-the-box) growth capabilities—not only in the traditional terms of pharmaceuticals and radiopharmaceuticals, engineering, physics, and the expansion of clinical indications but also in terms of embracing up-and-coming digital, performance-guided, and autonomous-surgery paradigms. Exploration of these opportunities will likely help expand the impact that nuclear medicine and molecular imaging have on the future of patient care.

DISCLOSURE

This research was financially supported by a Netherlands Organization for Scientific Research TTW-VICI grant (grant TTW 16141) and a KIC grant (grant 01460928). No other potential conflict of interest relevant to this article was reported.

Footnotes

  • Published online Jul. 11, 2024.

  • © 2024 by the Society of Nuclear Medicine and Molecular Imaging.

REFERENCES

  1. Guthart G. J.P. Morgan healthcare conference 2023. Intuitive website. https://isrg.intuitive.com/static-files/6683d2bb-75e2-4fa0-b0cd-463ead7c30a4. Published January 11, 2023. Accessed June 26, 2024.
  2. Bhandari M, Zeffiro T, Reddiboina M. Artificial intelligence and robotic surgery: current perspective and future directions. Curr Opin Urol. 2020;30:48–54.
  3. Berrens AC, Knipper S, Marra G, et al. State of the art in prostate-specific membrane antigen-targeted surgery: a systematic review. Eur Urol Open Sci. 2023;54:43–55.
  4. Valdés Olmos RA, Rietbergen DDD, Rubello D, et al. Sentinel node imaging and radioguided surgery in the era of SPECT/CT and PET/CT: toward new interventional nuclear medicine strategies. Clin Nucl Med. 2020;45:771–777.
  5. Lee YJ, van den Berg NS, Orosco RK, Rosenthal EL, Sorger JM. A narrative review of fluorescence imaging in robotic-assisted surgery. Laparosc Surg. 2021;5:31.
  6. van Beurden F, van Willigen DM, Vojnovic B, et al. Multi-wavelength fluorescence in image-guided surgery, clinical feasibility and future perspectives. Mol Imaging. 2020;19:1536012120962333.
  7. Azargoshasb S, de Barros HA, Rietbergen DDD, et al. Artificial intelligence-supported video analysis as a means to assess the impact of DROP-IN image guidance on robotic surgeons: radioguided sentinel lymph node versus PSMA-targeted prostate cancer surgery. Adv Intel Syst. 2023;5:2300192.
  8. van Oosterom MN, Rietbergen DDD, Welling MM, van der Poel HG, Maurer T, van Leeuwen FWB. Recent advances in nuclear and hybrid detection modalities for image-guided surgery. Expert Rev Med Devices. 2019;16:711–734.
  9. van Leeuwen FWB, Schottelius M, Brouwer OR, et al. Trending: radioactive and fluorescent bimodal/hybrid tracers as multiplexing solutions for surgical guidance. J Nucl Med. 2020;61:13–19.
  10. Mazzone E, Dell’Oglio P, Grivas N, et al. Diagnostic value, oncological outcomes and safety profile of image-guided surgery technologies during robot-assisted lymph node dissection with sentinel node biopsy for prostate cancer. J Nucl Med. 2021;62:1363–1371.
  11. Overwater A, Weusten BL, Ruurda JP, et al. Feasibility of sentinel node navigated surgery in high-risk T1b esophageal adenocarcinoma patients using a hybrid tracer of technetium-99m and indocyanine green. Surg Endosc. 2022;36:2671–2679.
  12. Rietbergen DDD, van Gennep EJ, KleinJan GH, et al. Evaluation of the hybrid tracer indocyanine green-99mTc-nanocolloid for sentinel node biopsy in bladder cancer: a prospective pilot study. Clin Nucl Med. 2022;47:774–780.
  13. Bohg J, Hausman K, Sankaran B, et al. Interactive perception: leveraging action in perception and perception in action. IEEE Trans Robot. 2017;33:1273–1291.
  14. Sun D, Bloomston M, Hinkle G, et al. Radioimmunoguided surgery (RIGS), PET/CT image-guided surgery, and fluorescence image-guided surgery: past, present, and future. J Surg Oncol. 2007;96:297–308.
  15. Kuokkanen K. Global surgical robot market is forecasted to reach USD 17.6 Bn by 2028. Statzon website. https://statzon.com/insights/global-surgical-robot-market. Published May 3, 2022. Accessed June 26, 2024.
  16. Fleming GA. Regulatory considerations for early clinical development of drugs for diabetes, obesity, and cardiometabolic disorders. In: Krentz AJ, Heinemann L, Hompesch M, eds. Translational Research Methods for Diabetes, Obesity and Cardiometabolic Drug Development: A Focus on Early Phase Clinical Studies. Springer; 2014:283–304.
  17. Luurtsema G, Pichler V, Bongarzone S, et al. EANM guideline for harmonisation on molar activity or specific activity of radiopharmaceuticals: impact on safety and imaging quality. EJNMMI Radiopharm Chem. 2021;6:34.
  18. Buckle T, Rietbergen DDD, de Wit-van der Veen L, Schottelius M. Lessons learned in application driven imaging agent design for image-guided surgery. Eur J Nucl Med Mol Imaging. June 20, 2024 [Epub ahead of print].
  19. KleinJan GH, Bunschoten A, van den Berg NS, et al. Fluorescence guided surgery and tracer-dose, fact or fiction? Eur J Nucl Med Mol Imaging. 2016;43:1857–1867.
  20. Nguyen HG, van den Berg NS, Antaris AL, et al. First-in-human evaluation of a prostate-specific membrane antigen-targeted near-infrared fluorescent small molecule for fluorescence-based identification of prostate cancer in patients with high-risk prostate cancer undergoing robotic-assisted prostatectomy. Eur Urol Oncol. 2024;7:63–72.
  21. Stibbe JA, de Barros HA, Linders DGJ, et al. First-in-patient study of OTL78 for intraoperative fluorescence imaging of prostate-specific membrane antigen-positive prostate cancer: a single-arm, phase 2a, feasibility trial. Lancet Oncol. 2023;24:457–467.
  22. van Leeuwen FWB, Winter A, van der Poel HG, et al. Technologies for image-guided surgery for managing lymphatic metastases in prostate cancer. Nat Rev Urol. 2019;16:159–171.
  23. Cockburn KC, Toumi Z, Mackie A, Julyan P. Radioguided surgery for gastroenteropancreatic neuroendocrine tumours: a systematic literature review. J Gastrointest Surg. 2021;25:3244–3257.
  24. Konrad M, Rinscheid A, Wienand G, et al. [99mTc]Tc-PentixaTec: development, extensive pre-clinical evaluation, and first human experience. Eur J Nucl Med Mol Imaging. 2023;50:3937–3948.
  25. Trujillo-Benítez D, Luna-Gutiérrez M, Ferro-Flores G, et al. Design, synthesis and preclinical assessment of 99mTc-iFAP for in vivo fibroblast activation protein (FAP) imaging. Molecules. 2022;27:264.
  26. Galt JR, Halkar RK, Evans CO, et al. In vivo assay of folate receptors in nonfunctional pituitary adenomas with 99mTc-folate SPECT/CT. J Nucl Med. 2010;51:1716–1723.
  27. Nock BA, Kaloudi A, Kanellopoulos P, et al. [99mTc]Tc-DB15 in GRPR-targeted tumor imaging with SPECT: from preclinical evaluation to the first clinical outcomes. Cancers (Basel). 2021;13:5093.
  28. Lechner P, Lind P, Snyder M, Haushofer H. Probe-guided surgery for colorectal cancer. Recent Results Cancer Res. 2000;157:273–280.
  29. Povoski SP, Neff RL, Mojzisik CM, et al. A comprehensive overview of radioguided surgery using gamma detection probe technology. World J Surg Oncol. 2009;7:11.
  30. Nisky I, Huang F, Milstein A, Pugh CM, Mussa-Ivaldi FA, Karniel A. Perception of stiffness in laparoscopy: the fulcrum effect. Stud Health Technol Inform. 2012;173:313–319.
  31. Meershoek P, KleinJan GH, van Willigen DM, et al. Multi-wavelength fluorescence imaging with a da Vinci Firefly: a technical look behind the scenes. J Robot Surg. 2021;15:751–760.
  32. Raffaelli M, Gallucci P, Voloudakis N, et al. The new robotic platform Hugo™ RAS for lateral transabdominal adrenalectomy: a first world report of a series of five cases. Updates Surg. 2023;75:217–225.
  33. Green ED, Paleri V, Hardman JC, et al. Integrated surgery and radiology: trans-oral robotic surgery guided by real-time radiologist-operated intraoral ultrasound. Oral Maxillofac Surg. 2020;24:477–483.
  34. van Oosterom MN, Azargoshasb S, Slof LJ, van Leeuwen FWB. Robotic radioguided surgery: toward full integration of radio- and hybrid-detection modalities. Clin Transl Imaging. 2023;11:533–544.
  35. Bertani E, Mattana F, Collamati F, et al. Radio-guided surgery with a new-generation β-probe for radiolabeled somatostatin analog, in patients with small intestinal neuroendocrine tumors. Ann Surg Oncol. 2024;31:4189–4196.
  36. Hwang K, Seo Y-H, Kim DY, et al. Handheld endomicroscope using a fiber-optic harmonograph enables real-time and in vivo confocal imaging of living cell morphology and capillary perfusion. Microsyst Nanoeng. 2020;6:72.
  37. Azargoshasb S, van Alphen S, Slof LJ, et al. The Click-On gamma probe, a second-generation tethered robotic gamma probe that improves dexterity and surgical decision-making. Eur J Nucl Med Mol Imaging. 2021;48:4142–4151.
  38. van Oosterom MN, van Leeuwen SI, Mazzone E, et al. Click-on fluorescence detectors: using robotic surgical instruments to characterize molecular tissue aspects. J Robot Surg. 2023;17:131–140.
  39. Dell’Oglio P, Meershoek P, Maurer T, et al. A DROP-IN gamma probe for robot-assisted radioguided surgery of lymph nodes during radical prostatectomy. Eur Urol. 2021;79:124–132.
  40. Collamati F, Morganti S, van Oosterom MN, et al. First-in-human validation of a DROP-IN β-probe for robotic radioguided surgery: defining optimal signal-to-background discrimination algorithm. Eur J Nucl Med Mol Imaging. January 20, 2024 [Epub ahead of print].
  41. Gorpas D, Phipps J, Bec J, et al. Autofluorescence lifetime augmented reality as a means for real-time robotic surgery guidance in human patients. Sci Rep. 2019;9:1187.
  42. Dell’Oglio P, Mazzone E, Buckle T, et al. Precision surgery: the role of intra-operative real-time image guidance: outcomes from a multidisciplinary European consensus conference. Am J Nucl Med Mol Imaging. 2022;12:74–80.
  43. Olde Heuvel J, de Wit-van der Veen BJ, van der Poel HG, et al. 68Ga-PSMA Cerenkov luminescence imaging in primary prostate cancer: first-in-man series. Eur J Nucl Med Mol Imaging. 2020;47:2624–2632.
  44. Fragoso Costa P, Shi K, Holm S, et al. Surgical radioguidance with beta-emitting radionuclides; challenges and possibilities: a position paper by the EANM. Eur J Nucl Med Mol Imaging. January 8, 2024 [Epub ahead of print].
  45. van Oosterom MN, Meershoek P, Welling MM, et al. Extending the hybrid surgical guidance concept with freehand fluorescence tomography. IEEE Trans Med Imaging. 2020;39:226–235.
  46. Wendler T, van Leeuwen FWB, Navab N, van Oosterom MN. How molecular imaging will enable robotic precision surgery. Eur J Nucl Med Mol Imaging. 2021;48:4201–4224.
  47. Lam K, Abràmoff MD, Balibrea JM, et al. A Delphi consensus statement for digital surgery. NPJ Digit Med. 2022;5:100.
  48. Parekh P, Patel S, Patel N, Shah M. Systematic review and meta-analysis of augmented reality in medicine, retail, and games. Vis Comput Ind Biomed Art. 2020;3:21.
  49. van Oosterom MN, van der Poel HG, Navab N, van de Velde CJH, van Leeuwen FWB. Computer-assisted surgery: virtual- and augmented-reality displays for navigation during urological interventions. Curr Opin Urol. 2018;28:205–213.
  50. Picard F, Deakin AH, Riches PE, Deep K, Baines J. Computer assisted orthopaedic surgery: past, present and future. Med Eng Phys. 2019;72:55–65.
  51. Azargoshasb S, Berrens A-C, Slof LJ, et al. Robot-assisted single photon emission computed tomography: integrating nuclear medicine in robotic urologic surgery. Eur Urol. 2024;85:503–505.
  52. Madani A, Namazi B, Altieri MS, et al. Artificial intelligence for intraoperative guidance: using semantic segmentation to identify surgical anatomy during laparoscopic cholecystectomy. Ann Surg. 2022;276:363–369.
  53. Reeves B. Health-technology assessment in surgery. Lancet. 1999;353(suppl 1):SI3–SI5.
  54. Lindenberg MA, Retèl VP, van der Poel HG, Bandstra F, Wijburg C, van Harten WH. Cost-utility analysis on robot-assisted and laparoscopic prostatectomy based on long-term functional outcomes. Sci Rep. 2022;12:7658.
  55. Michels CTJ, Wijburg CJ, Hannink G, Witjes JA, Rovers MM, Grutters JPC. Robot-assisted versus open radical cystectomy in bladder cancer: an economic evaluation alongside a multicentre comparative effectiveness study. Eur Urol Focus. 2022;8:739–747.
  56. Farinha R, Puliatti S, Mazzone E, et al. Potential contenders for the leadership in robotic surgery. J Endourol. 2022;36:317–326.
  57. Gandhi S, Tayebi Meybodi A, Belykh E, et al. Survival outcomes among patients with high-grade glioma treated with 5-aminolevulinic acid–guided surgery: a systematic review and meta-analysis. Front Oncol. 2019;9:620.
  58. Spencer F. Teaching and measuring surgical techniques: the technical evaluation of competence. Bull Am Coll Surg. 1978;63:9–12.
  59. Hung AJ, Ma R, Cen S, Nguyen JH, Lei X, Wagner C. Surgeon automated performance metrics as predictors of early urinary continence recovery after robotic radical prostatectomy: a prospective bi-institutional study. Eur Urol Open Sci. 2021;27:65–72.
  60. Azargoshasb S, Boekestijn I, Roestenberg M, et al. Quantifying the impact of signal-to-background ratios on surgical discrimination of fluorescent lesions. Mol Imaging Biol. 2023;25:180–189.
  61. Jamjoom AAB, Jamjoom AMA, Thomas JP, et al. Autonomous surgical robotic systems and the liability dilemma. Front Surg. 2022;9:1015367.
  62. Bargar WL, Bauer A, Börner M. Primary and revision total hip replacement using the Robodoc system. Clin Orthop. 1998;82–91.
  63. Babak S-J, Hussain SA, Karakas B, Cetin S. Control of autonomous ground vehicles: a brief technical review. IOP Conf Series Mater Sci Eng. 2017;224:012029.
  64. Hashimoto DA, Rosman G, Rus D, Meireles OR. Artificial intelligence in surgery: promises and perils. Ann Surg. 2018;268:70–76.
  65. Attanasio A, Scaglioni B, Momi ED, Fiorini P, Valdastri P. Autonomy in surgical robotics. Annu Rev Control Robot Auton Syst. 2021;4:651–679.
  66. Jiang Z, Salcudean SE, Navab N. Robotic ultrasound imaging: state-of-the-art and future perspectives. Med Image Anal. 2023;89:102878.
  67. Pattinson J-A, Chen H, Basu S. Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces. Humanit Soc Sci Commun. 2020;7:153.
  • Received for publication March 28, 2024.
  • Accepted for publication June 5, 2024.
Keywords

  • robotic surgery
  • digital surgery
  • image-guided surgery
  • autonomous robot
  • molecular imaging