News | Artificial Intelligence | December 03, 2019

AI Improves Chest X-ray Interpretation


Image courtesy of GE Healthcare

December 3, 2019 — A sophisticated type of artificial intelligence (AI) can detect clinically meaningful chest X-ray findings as effectively as experienced radiologists, according to a study published in the journal Radiology. Researchers said their findings, based on a type of AI called deep learning, could provide a valuable resource for the future development of AI chest radiography models.

Chest radiography, or X-ray, one of the most common imaging exams worldwide, is performed to help diagnose the source of symptoms like cough, fever and pain. Despite its popularity, the exam has limitations.

“We’ve found that there is a lot of subjectivity in chest X-ray interpretation,” said study co-author Shravya Shetty, an engineering lead at Google Health in Palo Alto, Calif. “Significant inter-reader variability and suboptimal sensitivity for the detection of important clinical findings can limit its effectiveness.”

Deep learning, a sophisticated type of AI in which the computer can be trained to recognize subtle patterns, has the potential to improve chest X-ray interpretation, but it too has limitations. For instance, results derived from one group of patients cannot always be generalized to the population at large.

Researchers at Google Health developed deep learning models for chest X-ray interpretation that overcome some of these limitations. They used two large datasets to develop, train and test the models. The first dataset consisted of more than 750,000 images from five hospitals in India, while the second set included 112,120 images made publicly available by the National Institutes of Health (NIH).

A panel of radiologists convened to create the reference standards for certain abnormalities visible on the chest X-rays used to train the models.

“Chest X-ray interpretation is often a qualitative assessment, which is problematic from a deep learning standpoint,” said Daniel Tse, M.D., product manager at Google Health. “By using a large, diverse set of chest X-ray data and panel-based adjudication, we were able to produce a more reliable evaluation for the models.”

Tests of the deep learning models showed that they performed on par with radiologists in detecting four findings on frontal chest X-rays: fractures; nodules or masses; opacity (an abnormal appearance on X-rays often indicative of disease); and pneumothorax (the presence of air or gas in the cavity between the lungs and the chest wall).

Radiologist adjudication led to increased expert consensus on the labels used for model tuning and performance evaluation. The overall consensus increased from just over 41 percent after the initial read to almost 97 percent after adjudication.

The rigorous model evaluation techniques have advantages over existing methods, researchers said. By beginning with a broad, hospital-based clinical image set, and then sampling a diverse set of cases and reporting population adjusted metrics, the results are more representative and comparable. Additionally, radiologist adjudication provides a reference standard that can be both more sensitive and more consistent than other methods.
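To make the evaluation approach concrete, here is a minimal, illustrative sketch (not the study's actual code) of how per-finding sensitivity and specificity can be computed against panel-adjudicated reference labels. The function name and the example label lists are hypothetical, chosen only for demonstration:

```python
# Illustrative sketch: scoring a model's binary predictions for one
# chest X-ray finding (e.g., pneumothorax) against reference labels
# produced by radiologist panel adjudication.

def sensitivity_specificity(predictions, reference):
    """Return (sensitivity, specificity) for one finding.

    predictions: list of bool, model says finding present
    reference:   list of bool, panel-adjudicated ground truth
    """
    tp = sum(1 for p, r in zip(predictions, reference) if p and r)
    tn = sum(1 for p, r in zip(predictions, reference) if not p and not r)
    fn = sum(1 for p, r in zip(predictions, reference) if not p and r)
    fp = sum(1 for p, r in zip(predictions, reference) if p and not r)
    sens = tp / (tp + fn) if (tp + fn) else float("nan")
    spec = tn / (tn + fp) if (tn + fp) else float("nan")
    return sens, spec

# Hypothetical adjudicated labels: True = finding present per the panel.
reference = [True, True, False, False, True, False]
predictions = [True, False, False, False, True, True]
print(sensitivity_specificity(predictions, reference))
```

Because adjudicated labels tend to be more consistent than single-reader labels, metrics computed this way are less distorted by label noise, which is the advantage the researchers describe.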

“We believe the data sampling used in this work helps to more accurately represent the incidence for these conditions,” Tse said. “Moving forward, deep learning can provide a useful resource to facilitate the continued development of clinically useful AI models for chest radiography.”

“The NIH database is a very important resource, but the current labels are noisy, and this makes it hard to interpret the results published on this data,” Shetty said. “We hope that the release of our labels will help further research in this field.”

For more information: www.rsna.org
