Paper
Development of an augmented reality approach to mammographic training: overcoming some real world challenges
13 March 2018
Abstract
A dedicated workstation and its corresponding viewing software are essential requirements in breast screener training. A major challenge in developing further generic screener training technology (in particular, for mammographic interpretation training) is that high-resolution radiological images must be displayed on dedicated workstations, whilst actual reporting of the images is generally completed on individual standard workstations. For commercial reasons, the leading international vendors of dedicated clinical workstations tend not to divulge the critical technical details that would facilitate integration of third-party generic screener training technology. On standard workstations, conventional screener training depends heavily on manual transcription, so traditional training methods can be deficient in terms of real-time feedback and interaction. Augmented reality (AR) provides the ability to interact with both real and virtual environments, and can therefore supplement conventional training with registered virtual objects and actions. As a result, realistic screener training can incorporate rich feedback and interaction in real time. Previous work [1] has shown that it is feasible to employ an AR approach to deliver workstation-independent radiological screening training by superimposing appropriate feedback coupled with the use of interaction interfaces. The previous study addressed presence issues and provided an AR-recognisable stylus which allowed drawing interaction. As a follow-up, this study extends the AR method and investigates realistic effects and the impacts of environmental illumination, application performance and transcription. A robust stylus calibration method is introduced to address environmental changes over time. Moreover, this work introduces a complete AR workflow which allows real-time recording, computer-analysable training data and recoverable transcription during post-training. A quantitative evaluation shows that more than 80% of user-drawn points are located within a pixel distance of 5.
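To make the reported evaluation metric concrete, the sketch below shows one plausible way such an accuracy figure could be computed: for each user-drawn point, take the pixel distance to its nearest reference point and report the fraction of drawn points within a 5-pixel threshold. This is only an illustrative reconstruction under stated assumptions; the abstract specifies the 5-pixel threshold and the >80% result, but not the exact matching procedure, and names such as fraction_within_threshold, drawn and reference are invented for the example.

import numpy as np

def fraction_within_threshold(drawn, reference, threshold_px=5.0):
    """Fraction of user-drawn points whose nearest reference point lies
    within `threshold_px` pixels.

    drawn, reference: arrays of shape (N, 2) and (M, 2) of (x, y) pixel
    coordinates. The nearest-point pairing rule is an assumption; the
    paper only reports the 5-pixel threshold and the >80% outcome.
    """
    drawn = np.asarray(drawn, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Pairwise Euclidean distances between drawn and reference points.
    dists = np.linalg.norm(drawn[:, None, :] - reference[None, :, :], axis=2)
    # For each drawn point, distance to its closest reference point.
    nearest = dists.min(axis=1)
    return float(np.mean(nearest <= threshold_px))

# Synthetic usage example only; these are not data from the paper.
rng = np.random.default_rng(0)
reference = rng.uniform(0, 500, size=(200, 2))
drawn = reference + rng.normal(0, 2.0, size=reference.shape)  # small jitter
print(f"accuracy: {fraction_within_threshold(drawn, reference):.2%}")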
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Qiang Tang, Yan Chen, Gerald Schaefer, and Alastair G. Gale "Development of an augmented reality approach to mammographic training: overcoming some real world challenges", Proc. SPIE 10576, Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling, 105762M (13 March 2018); https://doi.org/10.1117/12.2293496
KEYWORDS
Autoregressive models, Augmented reality, Calibration, Standards development, Visualization, Breast cancer, Data acquisition