Paper | 12 March 2018
Hierarchical model-based object localization for auto-contouring in head and neck radiation therapy planning
Yubing Tong, Jayaram K. Udupa, Xingyu Wu, Dewey Odhner, Gargi Pednekar, Charles B. Simone, David McLaughlin, Chavanon Apinorasethkul, Geraldine Shammo, Paul James, Joseph Camaratta, Drew A. Torigian
Abstract
Segmentation of organs at risk (OARs) is a key step during the radiation therapy (RT) treatment planning process. Automatic anatomy recognition (AAR) is a recently developed body-wide multiple-object segmentation approach, where segmentation is designed as two dichotomous steps: object recognition (or localization) and object delineation. Recognition is the high-level process of determining the whereabouts of an object, and delineation is the meticulous low-level process of precisely indicating the space occupied by an object. This study focuses on recognition.

The purpose of this paper is to introduce new features of the AAR recognition approach (abbreviated as AAR-R from now on): combining texture and intensity information in the recognition procedure, using an optimal spanning tree to derive the hierarchy that minimizes recognition errors, and illustrating recognition performance on large-scale testing computed tomography (CT) data sets. The data sets pertain to 216 non-serial (planning) and 82 serial (re-planning) studies of head and neck (H&N) cancer patients undergoing radiation therapy, involving a total of ~2600 object samples. The texture property "maximum probability of occurrence" derived from the co-occurrence matrix was determined to be the best property and is utilized in conjunction with intensity properties in AAR-R. An optimal spanning tree is found in the complete graph whose nodes are the individual objects, and this tree is then used as the hierarchy in recognition. Texture information combined with intensity significantly reduces location error for gland-related objects (parotid and submandibular glands). We also report recognition results stratified by image quality, which is a novel concept. AAR-R with these new features achieves a location error of less than 4 mm (~1.5 voxels in our studies) on good-quality images for both serial and non-serial studies.
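The abstract only names the two key ingredients of AAR-R. The sketch below is a minimal illustration of how they might be realized, not the authors' implementation: a gray-level co-occurrence matrix (GLCM) "maximum probability of occurrence" texture feature, and a minimum-cost spanning tree over a complete graph of objects used as a recognition hierarchy. The quantization level, pixel offset, pairwise cost definition, and root object are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree


def glcm_max_probability(patch, levels=32, offset=(0, 1)):
    """Texture feature: maximum probability of occurrence from a
    gray-level co-occurrence matrix (GLCM) of a 2D image patch.
    `levels` and `offset` are illustrative choices, not paper values."""
    lo, hi = float(patch.min()), float(patch.max())
    # Quantize intensities into `levels` gray levels.
    q = np.floor((patch - lo) / max(hi - lo, 1e-9) * (levels - 1)).astype(int)

    dr, dc = offset
    glcm = np.zeros((levels, levels), dtype=float)
    rows, cols = q.shape
    # Count co-occurrences of gray-level pairs at the given offset.
    for r in range(max(0, -dr), rows - max(0, dr)):
        for c in range(max(0, -dc), cols - max(0, dc)):
            glcm[q[r, c], q[r + dr, c + dc]] += 1

    glcm /= glcm.sum()        # normalize counts to joint probabilities
    return glcm.max()         # "maximum probability of occurrence"


def optimal_hierarchy(pairwise_cost, root=0):
    """Build a recognition hierarchy as the minimum-cost spanning tree of the
    complete graph whose nodes are objects. `pairwise_cost[i, j]` is assumed
    to be a positive score of how unreliably object j is predicted from
    object i; returns a dict mapping each object to its parent in the tree."""
    mst = minimum_spanning_tree(pairwise_cost).toarray()
    adjacency = (mst + mst.T) > 0     # undirected tree edges
    parent = {root: None}
    stack = [root]
    while stack:                      # orient edges away from the root object
        node = stack.pop()
        for child in np.nonzero(adjacency[node])[0]:
            if child not in parent:
                parent[child] = node
                stack.append(child)
    return parent
```

In the AAR framework the pairwise cost would presumably encode how variable one object's position is relative to another across training subjects, so that recognition proceeds from the most stably related objects downward; here it is left as an abstract input to the sketch.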
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yubing Tong, Jayaram K. Udupa, Xingyu Wu, Dewey Odhner, Gargi Pednekar, Charles B. Simone, David McLaughlin, Chavanon Apinorasethkul, Geraldine Shammo, Paul James, Joseph Camaratta, and Drew A. Torigian "Hierarchical model-based object localization for auto-contouring in head and neck radiation therapy planning", Proc. SPIE 10578, Medical Imaging 2018: Biomedical Applications in Molecular, Structural, and Functional Imaging, 1057822 (12 March 2018); https://doi.org/10.1117/12.2294042
CITATIONS
Cited by 10 scholarly publications.
KEYWORDS
Image quality, Computed tomography, Data modeling, Neck, Image segmentation, Radiotherapy, Head