Anemia affects more than ¼ of the world’s population, mostly concentrated in low-resource areas, and carries serious health risks. Yet current screening methods are inadequate because they cannot separate iron deficiency anemia (IDA) from genetic anemias such as thalassemia trait (TT), preventing targeted supplementation with oral iron. Here we present a cost-effective and accurate approach to diagnosing anemia and anemia type using measures of cell morphology determined through machine learning applied to optical light-scattering measurements. A partial least squares model shows that our system accurately extracts mean cell volume, red-cell size heterogeneity, and mean cell hemoglobin concentration. These clinical parameters (or the raw data themselves) can be submitted to machine-learning algorithms such as quadratic discriminants or support vector machines to classify a patient as healthy, IDA, or TT. A clinical trial of 268 Chinese children, of whom 49 had IDA and 24 had TT, shows >98% sensitivity and specificity for diagnosing anemia, with 81% sensitivity and 86% specificity for discriminating IDA from TT. The majority of misdiagnoses are IDA patients with particularly severe anemia, possibly requiring hospital care. Therefore, in a screening paradigm where anyone testing positive for TT is sent to the hospital for gold-standard diagnosis and care, we maximize patient benefit while minimizing the use of scarce resources.
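The classification step described above, a quadratic discriminant over extracted red-cell indices, can be sketched as follows. The class means, spreads, and index names (MCV, RDW, MCHC) are illustrative assumptions on synthetic data, not values or code from the trial:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted red-cell indices (NOT real patient data):
# columns ~ [MCV (fL), RDW (%), MCHC (g/dL)]; class means are illustrative.
means = {"healthy": [90, 13, 34], "IDA": [70, 18, 30], "TT": [65, 14, 31]}
X = np.vstack([rng.normal(m, [3, 1, 1], size=(100, 3)) for m in means.values()])
y = np.repeat(np.arange(3), 100)

def qda_fit(X, y):
    """Quadratic discriminant: fit one Gaussian (own covariance) per class."""
    params = []
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False)
        params.append((mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]))
    return params

def qda_predict(params, X):
    """Assign each sample to the class with the highest log-likelihood."""
    scores = []
    for mu, cov_inv, logdet in params:
        d = X - mu
        # Gaussian log-likelihood up to a constant (equal priors assumed)
        scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, cov_inv, d) + logdet))
    return np.argmax(scores, axis=0)

params = qda_fit(X, y)
acc = np.mean(qda_predict(params, X) == y)
print(f"training accuracy: {acc:.2f}")
```

On well-separated synthetic classes like these, the discriminant recovers nearly all labels; the abstract's reported sensitivities and specificities come from the actual trial data, not from this toy setup.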
The use of modern medical equipment in crisis and war zones by emergency medical teams (EMTs) of the World Health Organization is an important factor in fast and efficient humanitarian aid, and reliable vital-parameter monitoring is fundamental in mobile hospitals. Currently, maintenance of medical devices in structurally weak areas is difficult due to manufacturers’ proprietary standards. Harsh environmental influences such as dust, moisture, heat, or shock can lead to dysfunction and long-lasting failure of instrumentation; pulse oximetry and blood pressure measurements are particularly susceptible. We developed an open-source vital-parameter monitoring system for use under adverse conditions and in structurally weak areas. Blood oxygen levels, heart rate, blood pressure, and electrocardiograms are recorded and transferred to decentralized displays. The main focus is on the reliability and robustness of various optical sensors for pulse oximetry, on the repairability of the system by non-technical personnel, and on the availability of individual standard components. We therefore implemented a monitoring system based on an individual microcontroller for each vital parameter. Different optical sensors for measurement in transmission and in reflection were tested at suitable body sites with near-surface arteries. In combination with the electrocardiogram, evaluation of the pulse transit time (PTT) enables continuous blood pressure measurement, and a specially developed reflective optical sensor allows reliable measurement of blood oxygen level. Even in emergencies, the trend in blood pressure can be monitored with PTT without prior calibration; the reliability of this approach was investigated.
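The PTT-based blood pressure trend can be illustrated with a minimal sketch. The log-linear model and the constant `k` are common assumptions in the PTT literature, not parameters of this system, and the beat timings below are invented:

```python
import numpy as np

def pulse_transit_time(r_peaks, pulse_feet):
    """PTT = delay between the ECG R-peak and the arrival of the pulse
    wave at the peripheral optical sensor, per beat (seconds)."""
    return np.asarray(pulse_feet) - np.asarray(r_peaks)

def bp_trend(ptt, ptt_ref):
    """Relative systolic trend from PTT changes. Illustrative model:
    BP change proportional to -delta(ln PTT); k is a hypothetical
    sensitivity that would need per-study fitting."""
    k = 60.0  # mmHg per unit change in ln(PTT) -- assumed value
    return -k * np.log(np.asarray(ptt) / ptt_ref)

# Example beat timings (seconds): a shortening PTT suggests rising BP.
r_peaks = [0.0, 1.0, 2.0, 3.0]
pulse_feet = [0.25, 1.24, 2.22, 3.20]
ptt = pulse_transit_time(r_peaks, pulse_feet)
trend = bp_trend(ptt, ptt_ref=ptt[0])
print(np.round(trend, 1))
```

Because only relative changes of PTT enter the model, the trend can be tracked without an absolute cuff calibration, which matches the calibration-free monitoring described above; absolute pressures would still require calibration.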
Retinal optical coherence tomography (OCT) is increasingly used for quantifying neuroaxonal damage in diseases of the central nervous system such as multiple sclerosis. High-quality OCT images are essential for accurate intraretinal segmentation and for correct quantification of retinal thickness changes. The quality of OCT images depends largely on the operator and on patient compliance. Quality evaluation is time-consuming, and current OCT image-quality criteria depend on the experience of the grader and are therefore subjective. AQuA, an automatic, grader-independent, real-time feedback system for quality evaluation of retinal OCT images, was developed to standardize quality evaluation and data accuracy. It classifies images by signal quality, anatomical completeness, and segmentation plausibility, and has been validated by experienced graders. However, it is currently limited to OCT scans taken with one device from a single vendor. The aim of this work is to improve the ability of the AQuA quality classifier to generalize to new data by developing a convolutional neural network (CNN), AQuANet. Moreover, this CNN may serve as a base quality classifier that can be adapted to specific problems by transfer learning. AQuANet is trained on A-scan batches with quality labels obtained automatically with AQuA. A large training set of about 13,000 A-scan batches could thus be used, leading to an accuracy of 99.53%.
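The training-data preparation described above, splitting scans into A-scan batches and labeling them automatically, might look like the following sketch. The batch size, the threshold, and the mean-intensity "quality" proxy are stand-ins for AQuA's actual criteria, which are not specified here:

```python
import numpy as np

def make_ascan_batches(bscan, batch_size=16):
    """Split a B-scan (n_ascans x depth) into contiguous A-scan batches,
    the training unit described for AQuANet (batch_size is an assumption)."""
    n = (bscan.shape[0] // batch_size) * batch_size
    return bscan[:n].reshape(-1, batch_size, bscan.shape[1])

def label_by_signal_quality(batches, threshold=0.2):
    """Toy stand-in for AQuA's automatic labels: a batch is 'good' (1)
    if its mean intensity exceeds a threshold, else 'poor' (0)."""
    return (batches.mean(axis=(1, 2)) > threshold).astype(int)

rng = np.random.default_rng(1)
# Synthetic B-scan whose signal level rises across the scan.
bscan = rng.random((100, 496)) * np.linspace(0.05, 0.5, 100)[:, None]
batches = make_ascan_batches(bscan)
labels = label_by_signal_quality(batches)
print(batches.shape, labels)
```

A CNN would then be trained on these (batch, label) pairs; with labels generated automatically rather than by human graders, assembling a training set of thousands of batches becomes cheap, which is the point of the approach.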
Pupil phase masks for enhanced-depth-of-field microscopy were investigated using a spatial light modulator (SLM). The phase masks were evaluated in simulations in terms of the mean square error between in-focus and out-of-focus point spread functions. The resulting best-performing phase masks were tested on fluorosphere samples using a microscope add-on containing the SLM. First, z-stacks of fixed fluorospheres in an agarose medium were recorded in order to measure the extended depth of field. The same measurements were also performed on fluorospheres subjected to Brownian motion in an aqueous solution. The results show that, with deconvolution and appropriate filtering, it is possible to obtain sharp fluorosphere images with an extended depth of field of at least 10 μm.
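The simulation-based evaluation (mean square error between in-focus and out-of-focus PSFs) can be sketched with a simple Fourier-optics model. The cubic mask, its strength, and all scales below are illustrative assumptions, not the masks or parameters used in the study:

```python
import numpy as np

def psf(pupil_phase, defocus_waves, n=128, na_radius=0.4):
    """Incoherent PSF of a circular pupil carrying an extra phase mask
    plus a quadratic defocus term (simple scalar Fourier-optics model;
    all scales are arbitrary)."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    u, v = X / na_radius, Y / na_radius          # normalized pupil coordinates
    aperture = (u**2 + v**2) <= 1.0
    defocus = 2 * np.pi * defocus_waves * (u**2 + v**2)
    pupil = aperture * np.exp(1j * (pupil_phase + defocus))
    p = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
    return p / p.sum()

n = 128
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
u, v = X / 0.4, Y / 0.4
cubic = 30.0 * (u**3 + v**3)   # cubic phase mask; strength (30 rad) is illustrative

# MSE between in-focus and 2-waves-defocused PSFs: a good extended-depth-of-field
# mask keeps the PSF nearly defocus-invariant, so its MSE stays small.
mse_clear = np.mean((psf(np.zeros((n, n)), 0.0) - psf(np.zeros((n, n)), 2.0)) ** 2)
mse_cubic = np.mean((psf(cubic, 0.0) - psf(cubic, 2.0)) ** 2)
print(f"clear aperture MSE: {mse_clear:.2e}, cubic mask MSE: {mse_cubic:.2e}")
```

In this toy model the cubic mask's PSF changes far less under defocus than the clear aperture's, which is the criterion by which candidate masks were ranked before being displayed on the SLM; the defocus-invariant but blurred PSF is then restored by deconvolution.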