News | Artificial Intelligence | May 21, 2024

Image courtesy: RSNA Radiology (Nguyen, DL and Ren, Y et al)
May 21, 2024 — According to a newly published study of nearly 5,000 screening mammograms interpreted by an FDA-approved AI algorithm, patient characteristics such as race and age influenced false-positive results. The study, “Patient Characteristics Impact Performance of AI Algorithm in Interpreting Negative Screening Digital Breast Tomosynthesis Studies,” was published today in Radiology, a journal of the Radiological Society of North America (RSNA).

“AI has become a resource for radiologists to improve their efficiency and accuracy in reading screening mammograms while mitigating reader burnout,” said Derek L. Nguyen, M.D., a breast radiologist with Duke Health and assistant professor at Duke University in Durham, NC. “However, the impact of patient characteristics on AI performance has not been well studied.”

Key Results

- In a retrospective study including 4,855 breast screening patients, false-positive case scores were more likely in Black patients (odds ratio [OR] = 1.5) and less likely in Asian patients (OR = 0.7) compared with White patients (an illustrative odds-ratio calculation follows this list).

- False-positive risk scores were more likely in Black patients (OR = 1.5) and patients with extremely dense breasts (OR = 2.8) compared with White patients and patients with fatty density breasts, respectively.

- The influence of patient characteristics on algorithm performance necessitates more demographically diverse data sets for testing and training and greater transparency.
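An odds ratio (OR) compares the odds of a false-positive AI flag in one patient group against a reference group. As a rough, hypothetical illustration only (none of the counts below come from the paper), a calculation of this kind might look like the following Python sketch:

```python
# Minimal sketch (not the study's code) of how an odds ratio for
# false-positive case scores could be computed from raw 2x2 counts.
# All counts below are made up for illustration.

def odds_ratio(fp_group, n_group, fp_ref, n_ref):
    """Odds of a false-positive flag in one group versus a reference group."""
    odds_group = fp_group / (n_group - fp_group)
    odds_ref = fp_ref / (n_ref - fp_ref)
    return odds_group / odds_ref

# Hypothetical example: 150 false positives among 1,261 exams in one group
# versus 110 among 1,316 exams in the reference group gives an OR near 1.5.
print(round(odds_ratio(150, 1261, 110, 1316), 2))  # -> 1.48
```

An OR above 1 means the group is more likely than the reference group to receive a false-positive flag; an OR below 1 means it is less likely.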

In a written summary of the findings issued by RSNA, Nguyen said that while preliminary data suggest AI algorithms applied to screening mammography may improve radiologists’ diagnostic performance for breast cancer detection and reduce interpretation time, there are aspects of the technology to be aware of.

“There are few demographically diverse databases for AI algorithm training, and the FDA does not require diverse datasets for validation,” Nguyen said, adding: “Because of the differences among patient populations, it’s important to investigate whether AI software can accommodate and perform at the same level for different patient ages, races and ethnicities.”

Collaborating with Nguyen were Lars J. Grimm, M.D., M.S., and Joseph Y. Lo, Ph.D., both with the Duke University School of Medicine Department of Radiology; Tyler M. Jones, B.S., Duke University Pratt School of Engineering; Samantha M. Thomas, M.S., Duke University Department of Biostatistics and Bioinformatics; and Yinhao Ren, Ph.D., senior research scientist at iCAD.

AI Breast Imaging Study Details

In the retrospective study, researchers identified patients with negative (no evidence of cancer) digital breast tomosynthesis screening examinations performed at Duke University Medical Center between 2016 and 2019. All patients were followed for a two-year period after the screening mammograms, and no patients were diagnosed with a breast malignancy.

The researchers randomly selected a subset of this group consisting of 4,855 patients (median age 54 years) broadly distributed across four racial/ethnic groups. The subset included 1,316 (27%) White, 1,261 (26%) Black, 1,351 (28%) Asian, and 927 (19%) Hispanic patients.

A commercially available AI algorithm interpreted each exam in the subset of mammograms, generating both a case score (or certainty of malignancy) and a risk score (or one-year subsequent malignancy risk).

“Our goal was to evaluate whether an AI algorithm’s performance was uniform across age, breast density types and different patient race/ethnicities,” Nguyen said.

Given that all mammograms in the study were negative for the presence of cancer, anything flagged as suspicious by the algorithm was considered a false-positive result. False-positive case scores were significantly more likely in Black patients and older patients (71-80 years) and less likely in Asian patients and younger patients (41-50 years), compared with White patients and women between the ages of 51 and 60.
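Because every exam in the cohort was cancer-free, measuring performance by subgroup reduces to counting how often the algorithm flags a negative exam in each group. The sketch below uses an assumed record format ("race" and "case_score" fields) and an assumed case-score cutoff, neither of which comes from the study, to illustrate that tally:

```python
# Minimal sketch under assumed field names: since every exam in this cohort
# was negative, any exam whose AI case score meets or exceeds a suspicion
# threshold counts as a false positive for its subgroup.
from collections import defaultdict

CASE_SCORE_THRESHOLD = 50  # assumed cutoff; real vendors define their own scale

def false_positive_rates(exams):
    """Return the false-positive rate per race/ethnicity group."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for exam in exams:
        group = exam["race"]
        total[group] += 1
        if exam["case_score"] >= CASE_SCORE_THRESHOLD:  # flagged despite a cancer-free exam
            flagged[group] += 1
    return {group: flagged[group] / total[group] for group in total}

# Example with made-up records:
exams = [
    {"race": "White", "case_score": 12},
    {"race": "White", "case_score": 67},
    {"race": "Black", "case_score": 55},
    {"race": "Asian", "case_score": 8},
]
print(false_positive_rates(exams))  # {'White': 0.5, 'Black': 1.0, 'Asian': 0.0}
```

The study reports its findings as odds ratios rather than raw rates like these, but the underlying question is the same: whether the flag rate on negative exams differs by group.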

“This study is important because it highlights that any AI software purchased by a healthcare institution may not perform equally across all patient ages, races/ethnicities and breast densities,” Nguyen said. He added, “Moving forward, I think AI software upgrades should focus on ensuring demographic diversity.”

Nguyen said healthcare institutions should understand the patient population they serve before purchasing an AI algorithm for screening mammogram interpretation and ask vendors about their algorithm training.

“Having a baseline knowledge of your institution’s demographics and asking the vendor about the ethnic and age diversity of their training data will help you understand the limitations you’ll face in clinical practice,” he said. 

More information: www.rsna.org

Reference: https://doi.org/10.1148/radiol.232286

