Conventional pathology workflows rely on two-dimensional, slide-based analysis of thin tissue sections. This approach has several key limitations, including limited sampling, a lack of 3D structural information, and the destruction of valuable clinical specimens. There is growing interest in nondestructive 3D pathology to address these shortcomings. Existing work has mainly focused on small-scale proof-of-concept studies, due in part to the difficulty of producing consistent, high-quality 3D pathology datasets across hundreds to thousands of specimens. To facilitate large-scale clinical studies, we present an end-to-end workflow for 3D pathology, with an emphasis on data consistency and quality control.
KEYWORDS: Image segmentation, 3D modeling, Education and training, 3D image processing, Prostate, Data modeling, Biopsy, Pathology, Prostate cancer, Performance modeling
Significance: In recent years, we and others have developed non-destructive methods to obtain three-dimensional (3D) pathology datasets of clinical biopsies and surgical specimens. For prostate cancer risk stratification (prognostication), standard-of-care Gleason grading is based on examining the morphology of prostate glands in thin 2D sections. This motivates us to perform 3D segmentation of prostate glands in our 3D pathology datasets for the purposes of computational analysis of 3D glandular features that could offer improved prognostic performance.
Aim: To facilitate prostate cancer risk assessment, we developed a computationally efficient and accurate deep learning model for 3D gland segmentation based on open-top light-sheet microscopy datasets of human prostate biopsies stained with a fluorescent analog of hematoxylin and eosin (H&E).
Approach: For 3D gland segmentation based on our H&E-analog 3D pathology datasets, we previously developed a hybrid deep learning and computer vision-based pipeline, called image translation-assisted segmentation in 3D (ITAS3D), which required a complex two-stage procedure and tedious manual optimization of parameters. To simplify this procedure, we use the 3D gland-segmentation masks previously generated by ITAS3D as training datasets for a direct end-to-end deep learning-based segmentation model, nnU-Net. The inputs to this model are 3D pathology datasets of prostate biopsies rapidly stained with an inexpensive fluorescent analog of H&E and the outputs are 3D semantic segmentation masks of the gland epithelium, gland lumen, and surrounding stromal compartments within the tissue.
Results: nnU-Net demonstrates remarkable accuracy in 3D gland segmentations even with limited training data. Moreover, compared with the previous ITAS3D pipeline, nnU-Net operation is simpler and faster, and it can maintain good accuracy even with lower-resolution inputs.
Conclusions: Our trained DL-based 3D segmentation model will facilitate future studies to demonstrate the value of computational 3D pathology for guiding critical treatment decisions for patients with prostate cancer.
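As a rough illustration of how such a training set might be assembled, the sketch below packages OTLS prostate volumes and ITAS3D-derived masks into the nnU-Net v2 raw-data layout. The folder names, dataset ID, voxel spacing, single-channel input, and label values are assumptions made for illustration, not details taken from the study.

```python
"""
Minimal sketch (not the authors' exact pipeline): converting OTLS prostate
volumes and ITAS3D-derived semantic masks into the nnU-Net v2 dataset layout
so a 3D configuration can be trained on them. Paths, the dataset name, the
voxel spacing, and the label values are illustrative assumptions.
"""
import json
from pathlib import Path

import numpy as np
import tifffile
import SimpleITK as sitk

RAW = Path("nnUNet_raw/Dataset101_ProstateGlands")  # hypothetical dataset folder
(RAW / "imagesTr").mkdir(parents=True, exist_ok=True)
(RAW / "labelsTr").mkdir(parents=True, exist_ok=True)

def to_nifti(volume: np.ndarray, out_path: Path, spacing_um=(4.0, 1.0, 1.0)):
    """Write a (z, y, x) volume as NIfTI with an assumed voxel spacing."""
    img = sitk.GetImageFromArray(volume)
    img.SetSpacing(spacing_um[::-1])  # SimpleITK expects (x, y, z) ordering
    sitk.WriteImage(img, str(out_path))

cases = sorted(Path("otls_volumes").glob("*.tif"))  # hypothetical input folder
for i, tif in enumerate(cases):
    vol = tifffile.imread(tif).astype(np.float32)            # H&E-analog nuclear channel only, for simplicity
    mask = tifffile.imread(Path("itas3d_masks") / tif.name)  # ITAS3D-generated semantic mask
    to_nifti(vol, RAW / "imagesTr" / f"case_{i:03d}_0000.nii.gz")
    to_nifti(mask.astype(np.uint8), RAW / "labelsTr" / f"case_{i:03d}.nii.gz")

# dataset.json fields follow the nnU-Net v2 convention; label integers are assumed.
(RAW / "dataset.json").write_text(json.dumps({
    "channel_names": {"0": "nuclear"},
    "labels": {"background": 0, "epithelium": 1, "lumen": 2, "stroma": 3},
    "numTraining": len(cases),
    "file_ending": ".nii.gz",
}, indent=2))
```

From that point on, plan-and-preprocess, training of a 3D configuration, and prediction would be handled by nnU-Net's own standard entry points rather than custom code, which is what makes the end-to-end approach simpler to operate than the two-stage ITAS3D procedure.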
Non-destructive 3D microscopy enables the accurate characterization of diagnostically and prognostically significant microstructures in clinical specimens with significantly greater volumetric coverage than traditional 2D histology. We are using open-top light-sheet microscopy to image prostate cancer biopsies and investigating the prognostic significance of 3D spatial features of nuclei within prostate cancer microstructures. Using a previously published 3D nuclear segmentation workflow, we identify a preliminary set of 3D graph-based nuclear features to quantify the 3D spatial arrangement of nuclei in prostate cancer biopsies. Using a machine learning classifier, we identify the features that are prognostic of prostate cancer risk and demonstrate agreement with patient outcomes.
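To make the idea of 3D graph-based nuclear features concrete, the sketch below computes a few simple spatial statistics from segmented nuclear centroids and feeds them to an off-the-shelf classifier. The specific statistics, the classifier choice, and the synthetic data are assumptions for illustration and are not the preliminary feature set or classifier used in the study.

```python
"""
Illustrative sketch (not the published feature set): simple 3D graph-based
statistics from nuclear centroids, followed by an off-the-shelf classifier.
"""
import numpy as np
from scipy.spatial import Delaunay, cKDTree
from sklearn.ensemble import RandomForestClassifier

def nuclear_graph_features(centroids_um: np.ndarray) -> np.ndarray:
    """centroids_um: (N, 3) array of nuclear centroids in micrometers."""
    # Nearest-neighbor spacing from a k-d tree (k=2: self plus nearest neighbor).
    nn_dist, _ = cKDTree(centroids_um).query(centroids_um, k=2)
    nn_dist = nn_dist[:, 1]

    # Edges of the 3D Delaunay tessellation as a crude neighborhood graph.
    tri = Delaunay(centroids_um)
    edges = set()
    for simplex in tri.simplices:           # each simplex is a tetrahedron (4 vertices)
        for a in range(4):
            for b in range(a + 1, 4):
                edges.add((min(simplex[a], simplex[b]), max(simplex[a], simplex[b])))
    edge_len = np.array([np.linalg.norm(centroids_um[a] - centroids_um[b]) for a, b in edges])

    return np.array([
        nn_dist.mean(), nn_dist.std(),        # local packing of nuclei
        edge_len.mean(), edge_len.std(),      # spread of Delaunay edge lengths
        2 * len(edges) / len(centroids_um),   # mean graph degree
    ])

# Hypothetical usage with synthetic data standing in for real biopsies:
rng = np.random.default_rng(0)
per_biopsy_centroids = [rng.uniform(0, 500, size=(300, 3)) for _ in range(20)]
risk_labels = rng.integers(0, 2, size=20)
X = np.stack([nuclear_graph_features(c) for c in per_biopsy_centroids])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, risk_labels)
```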
Open-Top Light-Sheet (OTLS) microscopy is an emerging technique for cleared-tissue imaging that alleviates many of the physical constraints on sample size and geometry associated with conventional light-sheet microscopes. Existing OTLS designs generally use either two orthogonal objectives or a single shared objective for illumination and collection; however, these architectures have various limitations for moderate-NA cleared-tissue imaging. We previously developed an alternative Nonorthogonal Dual-Objective (NODO) OTLS configuration with a number of advantages for moderate-NA imaging of cleared tissues. Here we describe our latest NODO system, which is optimized specifically for high-throughput imaging of clinical samples for 3D pathology.
Light-sheet microscopy (LSM) has emerged as the technique of choice for many biologists imaging large cleared tissues due to its speed and optical efficiency, which make it possible to generate massive datasets of large specimens at high resolution. Here, we build on several recent innovations in LSM to present a non-orthogonal dual-objective (NODO) LSM system with axial sweeping in an open-top configuration. This system is specifically designed to image large cleared brain tissues, such as for axonal connectomics, and provides subcellular resolution (0.3 µm lateral, 2 µm axial) for cleared samples up to 8 mm thick.
Glandular architecture is currently the basis for the Gleason grading of prostate biopsies. To visualize and computationally analyze glandular features in large 3D pathology datasets, we developed an annotation-free segmentation method for 3D prostate glands that relies upon synthetic 3D immunofluorescence (IF) enabled by generative adversarial networks. By using a fluorescent analog of H&E (a fast and inexpensive stain) as an input, our strategy allows for accurate glandular segmentation that does not rely upon subjective and tedious human annotations or slow and expensive 3D immunolabeling. We aim to demonstrate that this 3D segmentation will enable improved prostate cancer prognostication.
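The sketch below illustrates, in simplified form, how a synthetic epithelium-specific IF channel could be turned into a three-compartment mask with classical thresholding and morphology. It is not the ITAS3D implementation; the threshold choice, size filters, and compartment labeling are assumptions for illustration.

```python
"""
Simplified illustration (not the ITAS3D implementation): once a GAN has
rendered a synthetic epithelium-specific IF channel from the H&E-analog
volume, a classical thresholding/morphology pass can separate epithelium,
lumen, and stroma. Threshold and size-filter choices are assumptions.
"""
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects

def segment_glands(synthetic_if: np.ndarray, min_size_vox: int = 500) -> np.ndarray:
    """synthetic_if: 3D array, higher values where the GAN predicts epithelium."""
    epithelium = synthetic_if > threshold_otsu(synthetic_if)
    epithelium = remove_small_objects(epithelium, min_size=min_size_vox)

    # Lumen = space enclosed by gland boundaries that is not itself epithelium.
    filled = ndimage.binary_fill_holes(epithelium)
    lumen = remove_small_objects(filled & ~epithelium, min_size=min_size_vox)

    # Everything else is treated as stroma in this toy three-compartment mask.
    labels = np.zeros(synthetic_if.shape, dtype=np.uint8)
    labels[epithelium] = 1
    labels[lumen] = 2
    labels[labels == 0] = 3
    return labels

# Hypothetical usage on a small synthetic volume:
demo = np.random.rand(64, 128, 128).astype(np.float32)
mask = segment_glands(demo)
```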
Fluorescence-based slide-free digital pathology techniques, including 3D microscopy, are gaining interest as alternatives to traditional slide-based histology. Since pathologists are accustomed to the appearance of standard histology stains, the ability to render grayscale fluorescent images with color palettes that mimic traditional histology is valuable. We present FalseColor-Python, an open-source rapid digital-staining package that renders two-channel fluorescence images to mimic standard histology. Our package offers consistent color-space representations that are robust to both intra-specimen and inter-specimen variations in staining and imaging conditions, along with GPU-accelerated methods to process large datasets efficiently.
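For context, the sketch below shows a generic Beer-Lambert-style false-coloring step of the kind such packages build on. It is not the FalseColor-Python API, and the absorption coefficients are placeholder values rather than the package's defaults.

```python
"""
Generic sketch of Beer-Lambert-style virtual H&E rendering (illustrative only;
this does not reproduce the FalseColor-Python API or its GPU-accelerated path).
"""
import numpy as np

# Placeholder per-RGB absorption coefficients for the two fluorescence channels:
# nuclear (hematoxylin-like) and cytoplasmic/stromal (eosin-like).
K_NUCLEAR = np.array([1.10, 2.20, 1.60])
K_CYTO    = np.array([0.25, 1.10, 0.60])

def virtual_he(nuclear: np.ndarray, cyto: np.ndarray) -> np.ndarray:
    """Map two normalized grayscale channels (values ~0..1) to an H&E-like RGB image."""
    nuc = np.clip(nuclear, 0, None)[..., None]   # add trailing RGB axis for broadcasting
    cyt = np.clip(cyto, 0, None)[..., None]
    # Beer-Lambert: white light attenuated by the two virtual absorbers.
    rgb = np.exp(-(K_NUCLEAR * nuc + K_CYTO * cyt))
    return (255 * rgb).astype(np.uint8)

# Hypothetical usage on a synthetic two-channel image:
nuclear = np.random.rand(256, 256).astype(np.float32)
cyto = np.random.rand(256, 256).astype(np.float32)
he_like = virtual_he(nuclear, cyto)   # (256, 256, 3) uint8; white background where signal is low
```

In a pipeline like this, the consistency across specimens highlighted in the abstract depends largely on how each channel is normalized (for example, background leveling and intensity scaling) before the color mapping is applied.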