In this study, we first introduce a novel AI-based system (MOM-ClaSeg) for multiple abnormality/disease detection and diagnostic report generation on PA/AP CXR images, which was recently developed by applying an augmented Mask R-CNN deep learning framework and Decision Fusion Networks. We then evaluate the performance of the MOM-ClaSeg system in assisting radiologists in image interpretation and diagnostic report generation through a multi-reader, multi-case (MRMC) study. A total of 33,439 PA/AP CXR images were retrospectively collected from 15 hospitals and divided into an experimental group of 25,840 images and a control group of 7,599 images, processed and not processed by the MOM-ClaSeg system, respectively. In this MRMC study, 6 junior radiologists (5-10 years' experience) first read these images and generated initial diagnostic reports with or without viewing the MOM-ClaSeg-generated results. Next, the initial reports were reviewed by 2 senior radiologists (>15 years' experience) to generate final reports. Additionally, 3 consensus expert radiologists (>25 years' experience) reconciled potential differences between initial and final reports. Comparison results showed that, using MOM-ClaSeg, the diagnostic sensitivity of junior radiologists increased significantly by 18.67 percentage points (from 70.76% to 89.43%, P<0.001), while specificity decreased by 3.36 percentage points (from 99.49% to 96.13%, P<0.001). Average reading/diagnostic time in the experimental group with MOM-ClaSeg was reduced by 27.07% (P<0.001), with a particularly large reduction of 66.48% (P<0.001) on abnormal images, indicating that the MOM-ClaSeg system has potential for fast lung abnormality/disease triaging. This study demonstrates the feasibility of applying this first-of-its-kind AI-based system to assist radiologists in image interpretation and diagnostic report generation, which is a promising step toward improved diagnostic performance and productivity in future clinical practice.
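The sensitivity and specificity changes above are percentage-point differences between the two reading conditions. A minimal sketch, using the figures reported in the abstract (the helper function name is ours, purely illustrative):

```python
# Sketch: verify the percentage-point deltas reported in the abstract.
# All input figures are taken directly from the abstract text.

def delta_pp(before: float, after: float) -> float:
    """Change in percentage points between two rates given as percents."""
    return round(after - before, 2)

# Junior radiologists' performance without vs. with MOM-ClaSeg assistance
sensitivity_gain = delta_pp(70.76, 89.43)   # +18.67 percentage points
specificity_loss = delta_pp(99.49, 96.13)   # -3.36 percentage points

print(sensitivity_gain, specificity_loss)   # 18.67 -3.36
```

Reporting the change in percentage points (rather than relative percent change) keeps the sensitivity and specificity deltas directly comparable on the same scale.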
Chest x-ray radiography (CXR) is widely used in screening for and detecting lung diseases. However, reading CXR images is often difficult, resulting in diagnostic errors and inter-reader variability. To address this clinical challenge, a Multi-task, Optimal-recommendation, and Max-predictive Classification and Segmentation (MOM-ClaSeg) system was developed to detect and delineate different abnormal regions of interest (ROIs) on CXR images, make multiple recommendations of abnormalities sorted by the generated probability scores, and automatically generate a diagnostic report. MOM-ClaSeg consists of convolutional neural networks that generate a detection, a finer-grained segmentation, and a prediction score for each ROI based on an augmented Mask R-CNN framework, and multi-layer perceptron neural networks that fuse these results to generate the optimal recommendation for each detected ROI based on a decision fusion framework. A total of 310,333 adult CXR images, comprising 67,071 normal and 243,262 abnormal images depicting 307,415 confirmed ROIs of 65 different abnormalities, were assembled to train MOM-ClaSeg. An independent set of 22,642 CXR images was assembled to test MOM-ClaSeg. Radiologists detected 6,646 ROIs depicting 43 different types of abnormalities on 4,068 CXR images. Compared with the radiologists' detection results, the MOM-ClaSeg system detected 6,009 true-positive ROIs and 6,379 false-positive ROIs, which represents 90.3% sensitivity and 0.28 false-positive ROIs per image. For the eight common diseases, the computed areas under the ROC curves ranged from 0.880 to 0.988. Additionally, 70.4% of MOM-ClaSeg-detected abnormalities, along with the system-generated diagnostic reports, were directly accepted by radiologists. This study presents the first AI-based multi-task prediction system to detect different abnormalities and generate diagnostic reports to assist radiologists in accurately and efficiently detecting lung diseases.
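The test-set detection metrics can be reproduced from the ROI counts reported above. A minimal sketch, with all counts taken from the abstract (variable names are ours):

```python
# Sketch: recompute the ROI-level detection metrics from the counts
# reported in the abstract. Variable names are illustrative.

true_positive_rois = 6009    # MOM-ClaSeg ROIs matching radiologist-marked ROIs
false_positive_rois = 6379   # MOM-ClaSeg ROIs with no radiologist match
reference_rois = 6646        # radiologist-detected ROIs (reference standard)
test_images = 22642          # independent test CXR images

sensitivity = true_positive_rois / reference_rois   # fraction of reference ROIs found
fp_per_image = false_positive_rois / test_images    # false positives per image

print(f"sensitivity={sensitivity:.3f}, FP/image={fp_per_image:.2f}")
```

Computed this way, the false-positive rate rounds to the reported 0.28 per image, and the ROI-level sensitivity comes out at roughly 90.4%, essentially matching the reported 90.3% (any small discrepancy may reflect rounding or a per-abnormality averaging scheme in the original analysis).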