This paper presents new imaging systems for the estimation of physiological random processes in medical imaging. In this work, a physiological random process is a sequence of biochemical interactions taking place inside a living organism. These interactions involve molecules such as proteins and enzymes that behave differently in response to external stimuli, such as nutrients or administered drugs. Understanding how these physiological processes interact and evolve is critical to the development of effective therapeutic approaches. The general setup of our imaging systems includes a fast detector that measures visible light, from which various parameters of the radiation emitted by the physiological process(es) of interest are estimated. Our setup is applicable to imaging with different kinds of radiation, including gamma rays (SPECT and PET) and charged particles, such as alpha and beta particles. The parameters we estimate for these photons/particles go beyond the 2D or 3D position typically measured in medical imaging applications and include the direction of propagation and the photon/particle energy. Recent work has shown the advantage of measuring direction of propagation and energy in addition to position: when these additional photon/particle parameters are taken into account during reconstruction, the null space of the imaging system is strongly reduced or eliminated. This reduction in null space is critical for adequately characterizing complicated physiological processes.
Purpose: The goal of this research is to develop innovative methods of acquiring simultaneous multidimensional molecular images of several different physiological random processes (PRPs) that might all be active in a particular disease such as COVID-19.
Approach: Our study is part of an ongoing effort at the University of Arizona to derive biologically accurate yet mathematically tractable models of the objects of interest in molecular imaging and of the images they produce. In both cases, the models are fully stochastic, in the sense that they provide ways to estimate any estimable property of the object or image. The mathematical tool we use for images is the characteristic function, which can be calculated if the multivariate probability density function for the image data is known. For objects, which are functions of continuous variables rather than discrete pixels or voxels, the characteristic function becomes infinite dimensional, and we refer to it as the characteristic functional.
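For concreteness, the two tools take the following form (a sketch in our own notation, not necessarily the authors'; sign and normalization conventions vary across the literature):

```latex
% Characteristic function of discrete image data \mathbf{g} (a random vector):
\psi_{\mathbf{g}}(\boldsymbol{\xi})
  = \Big\langle \exp\!\big(-2\pi i\,\boldsymbol{\xi}^{t}\mathbf{g}\big) \Big\rangle_{\mathbf{g}}

% Characteristic functional of a continuous random object f(\mathbf{r}),
% obtained by replacing the finite-dimensional inner product with an integral:
\Psi_{f}(\xi)
  = \Big\langle \exp\!\Big(-2\pi i \!\int \xi(\mathbf{r})\, f(\mathbf{r})\, \mathrm{d}^{3}r \Big) \Big\rangle_{f}
```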
Results: Several innovative mathematical results are derived, in particular for simultaneous imaging of multiple PRPs. The application of these methods to cancers that disrupt the mammalian target of rapamycin (mTOR) signaling pathway and to COVID-19 is then discussed qualitatively. One reason for choosing these two problems is that they both involve lipid rafts.
Conclusions: We found that it was necessary to employ a new algorithm for energy estimation to do simultaneous single-photon emission computed tomography imaging of a large number of different tracers. With this caveat, however, we expect to be able to acquire and analyze an unprecedented amount of molecular imaging data for an individual COVID-19 patient.
The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. This paper shows how the TOC curve can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. The mathematical analogy between the response of observers to images and the response of tumors to distributions of a chemotherapy drug is exploited to obtain linear discriminant functions from which AUTOC can be calculated. Methods for using mathematical models of drug delivery and tumor response with imaging data to estimate patient-specific parameters needed for the calculation of AUTOC are outlined. The implications of this viewpoint for clinical trials are discussed.
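To illustrate how AUTOC can be computed in practice (a minimal sketch in our own notation, not the paper's code; the operating points below are hypothetical), each overall dose level contributes one TOC operating point, and the area follows from numerical integration:

```python
# Minimal sketch (not the authors' code): AUTOC from sampled TOC operating
# points. Sweeping the overall dose level d yields points (P_NTC(d), P_TC(d)):
# probability of normal-tissue complications on the x-axis versus
# probability of tumor control on the y-axis.
import numpy as np

def autoc(p_ntc, p_tc):
    """Area under the TOC curve by the trapezoidal rule.

    p_ntc, p_tc -- operating points swept over dose, sorted so that
    p_ntc is nondecreasing; endpoints (0,0) and (1,1) are appended.
    """
    x = np.concatenate(([0.0], np.asarray(p_ntc, float), [1.0]))
    y = np.concatenate(([0.0], np.asarray(p_tc, float), [1.0]))
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

# Hypothetical sweep over five dose levels.
p_ntc = [0.01, 0.05, 0.15, 0.40, 0.80]
p_tc = [0.30, 0.60, 0.85, 0.95, 0.99]
print(f"AUTOC = {autoc(p_ntc, p_tc):.3f}")
```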
The statistics of detector outputs produced by an imaging system are derived from basic radiometric concepts and definitions. We show that a fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular, and wavelength variables. We begin the paper by recalling the concept of radiance in geometrical optics, radiology, physical optics, and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Building upon these concepts, we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors, which are capable of measuring radiance on a photon-by-photon basis. This allows us to rigorously show how the concept of radiance is related to the statistical properties of detector outputs and to the information content of a single detected photon. A Monte Carlo technique, derived from the Boltzmann transport equation, is presented as a way to estimate probability density functions for use in reconstruction from photon-processing data.
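To make the last point concrete, here is a toy illustration (our own example under simplified assumptions, not the paper's transport code): in a uniform medium the transport equation reduces to Beer-Lambert attenuation, so free path lengths are exponentially distributed, and histogramming Monte Carlo samples recovers the density:

```python
# Toy Monte Carlo estimate of a probability density from a transport model.
# Photons traverse a uniform slab with attenuation coefficient mu; free path
# lengths follow p(s) = mu * exp(-mu * s). Histogramming sampled interaction
# depths recovers this density empirically.
import numpy as np

rng = np.random.default_rng(0)
mu = 0.5            # attenuation coefficient [1/cm], arbitrary choice
n_photons = 100_000

# Inverse-CDF sampling of the exponential free-path distribution;
# 1 - U keeps the argument of the log strictly positive.
depths = -np.log(1.0 - rng.random(n_photons)) / mu

# Compare the empirical density estimate with the analytic pdf.
hist, edges = np.histogram(depths, bins=50, range=(0, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = mu * np.exp(-mu * centers)
print("max abs error:", np.abs(hist - analytic).max())
```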
Scientific computing is rapidly advancing due to the introduction of powerful new computing hardware, such as graphics processing units (GPUs). Affordable thanks to mass production, GPU processors enable the transition to efficient parallel computing by bringing the performance of a supercomputer to a workstation. We elaborate on some of the capabilities and benefits that GPU technology offers to the field of biomedical imaging. As practical examples, we consider a GPU algorithm for the estimation of the position of interaction from photomultiplier tube (PMT) data, as well as a GPU implementation of the maximum-likelihood expectation-maximization (MLEM) algorithm for iterative image reconstruction.
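As a sketch of the second example (our own NumPy rendering under simplified assumptions, not the GPU code discussed here; on a GPU the same matrix-vector products and elementwise updates map naturally to parallel kernels, e.g., via CUDA or CuPy):

```python
# Sketch of the MLEM update for emission tomography. H is the system matrix,
# g the measured counts, lam the activity estimate. Each iteration performs a
# forward projection, a measured/expected ratio, and a backprojection.
import numpy as np

def mlem(H, g, n_iter=50, eps=1e-12):
    """Maximum-likelihood expectation-maximization reconstruction."""
    sens = H.sum(axis=0)              # sensitivity image, H^T 1
    lam = np.ones(H.shape[1])         # flat initial estimate
    for _ in range(n_iter):
        proj = H @ lam                # forward projection
        ratio = g / np.maximum(proj, eps)
        lam *= (H.T @ ratio) / np.maximum(sens, eps)
    return lam

# Tiny synthetic example with a random system matrix and Poisson counts.
rng = np.random.default_rng(1)
H = rng.random((64, 16))
lam_true = rng.random(16)
g = rng.poisson(H @ lam_true)
print(mlem(H, g).round(2))
```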
There are two basic sources of uncertainty in cancer chemotherapy: how much of the therapeutic agent reaches the cancer cells, and how effective it is in reducing or controlling the tumor when it gets there. There is also a concern about adverse effects of the therapy drug. Similarly, in external-beam radiation therapy or radionuclide therapy there are two sources of uncertainty, the delivery and the efficacy of the radiation absorbed dose, and again there is a concern about radiation damage to normal tissues. The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. The TOC curve can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. In this paper we discuss the potential of using mathematical models of drug delivery and tumor response with imaging data to estimate AUTOC for chemotherapy, again for a single patient. This approach provides a basis for truly personalized therapy and for rigorously assessing and optimizing the therapy regimen for the particular patient. A key role is played by emission computed tomography (PET or SPECT) of radiolabeled chemotherapy drugs.
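To illustrate how a TOC curve arises (a hypothetical sketch, not the paper's models: the logistic dose-response form and all parameter values below are our own placeholders), sweeping the dose through sigmoid tumor-control and complication curves traces out one operating point per dose level, from which AUTOC can then be integrated as in the sketch above:

```python
# Hypothetical TOC-curve construction from parametric dose-response models.
# Tumor control probability (TCP) and normal-tissue complication probability
# (NTCP) are modeled as logistic functions of dose; d50 is the dose giving
# 50% response and gamma the normalized slope at d50.
import numpy as np

def logistic(d, d50, gamma):
    """Sigmoid dose-response: probability as a function of dose d."""
    return 1.0 / (1.0 + (d50 / np.maximum(d, 1e-12)) ** (4.0 * gamma))

doses = np.linspace(0.0, 120.0, 200)           # swept dose levels [Gy]
p_tc = logistic(doses, d50=55.0, gamma=2.0)    # tumor control
p_ntc = logistic(doses, d50=75.0, gamma=3.0)   # normal-tissue complications

# Each dose gives one operating point (P_NTC, P_TC) on the TOC curve.
for d in (40.0, 60.0, 80.0):
    i = np.argmin(np.abs(doses - d))
    print(f"dose {d:5.1f} Gy -> P_NTC = {p_ntc[i]:.2f}, P_TC = {p_tc[i]:.2f}")
```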
A fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular, and wavelength variables. The mean of this random process is the spectral radiance. The principle of conservation of radiance then allows a full characterization of the noise in the image (conditional on viewing a specified object). To elucidate these connections, we first review the definitions and basic properties of radiance as defined in terms of geometrical optics, radiology, physical optics, and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Then we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors. The relation between the radiance and the statistical properties of the detector output is discussed and related to task-based measures of image quality and the information content of a single detected photon.
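As a concrete anchor for the opening claim, the characteristic functional of a Poisson random process with intensity μ takes the standard point-process form below (textbook material rather than a result specific to this paper; the sign convention in the exponent varies):

```latex
% Characteristic functional of a Poisson random process with mean
% (intensity) \mu(\mathbf{x}), where \mathbf{x} collects the spatial,
% angular, and wavelength variables and \mu is set by the spectral radiance.
\Psi(\xi) = \exp\!\left\{ \int \Big[ e^{-2\pi i\,\xi(\mathbf{x})} - 1 \Big]\, \mu(\mathbf{x})\, \mathrm{d}\mathbf{x} \right\}
```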
The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets with masses less than 13 times that of Jupiter have been imaged to date. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO, have predicted contrast performance roughly a thousand times poorer than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential for improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using receiver operating characteristic (ROC) and localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging held in Squaw Valley, California, in March 2012.
We propose a new post-processing technique for the detection of faint companions and the estimation of their parameters from adaptive optics (AO) observations. We apply the optimal linear detector, which is the Hotelling observer, to perform detection, astrometry, and photometry on real and simulated data. The real data were obtained from the AO system on the 3 m Lick telescope.¹
The Hotelling detector, which is a prewhitening matched filter, computes the Hotelling test statistic, which is then compared to a threshold. If the test statistic exceeds the threshold, the algorithm decides that a companion is present. This decision is the main task performed by the Hotelling observer. After a detection is made, the location and intensity of the companion that maximize this test statistic are taken as the estimated values.
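In matrix form (a minimal sketch in our own notation, with synthetic data; the covariance model below is a placeholder, not the statistics of real AO residuals), the prewhitening matched filter reads:

```python
# Hotelling test statistic as a prewhitening matched filter. K is the
# covariance of the background data, s the expected companion signal,
# g the observed image (flattened to a vector).
import numpy as np

def hotelling_statistic(g, mean_bg, s, K):
    """t = s^T K^{-1} (g - mean_bg); compare t to a threshold to decide
    whether a companion is present."""
    w = np.linalg.solve(K, s)        # prewhitened template, K^{-1} s
    return float(w @ (g - mean_bg))

# Toy example: correlated Gaussian background plus a weak point signal.
rng = np.random.default_rng(2)
n = 100
A = rng.random((n, n))
K = A @ A.T + n * np.eye(n)          # a valid (positive-definite) covariance
s = np.zeros(n)
s[40] = 5.0                          # hypothetical companion signal
g = rng.multivariate_normal(np.zeros(n), K) + s
print("t =", hotelling_statistic(g, np.zeros(n), s, K))
```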
We compare the Hotelling approach with detection algorithms currently in wide use in astronomy. We discuss the use of the estimation receiver operating characteristic (EROC) curve in quantifying the performance of the algorithm when no prior estimate of the companion's location or intensity is available. The robustness of this technique to errors in point-spread-function (PSF) estimation is also investigated.
In objective or task-based assessment of image quality, figures of merit are defined by the performance of some specific observer on some task of scientific interest. This methodology is well established in medical imaging but is just beginning to be applied in astronomy. In this paper we survey the theory needed to understand the performance of ideal or ideal-linear (Hotelling) observers on detection tasks with adaptive-optical data. The theory is illustrated by discussing its application to detection of exoplanets from a sequence of short-exposure images.
SC1296: High-Performance Computing for Medical Imaging on Graphics Processing Units (GPU) with CUDA
This course covers the basic principles of graphics processing unit (GPU) programming with CUDA. To become familiar with the programming model, we will start with a simple example, followed by more in-depth topics in GPU programming. Some applications to medical imaging will be presented. Anyone who wants to learn how to parallelize their code and make it run ten times faster by harnessing the massively parallel capabilities of modern GPUs will benefit from this course.