A new artificial intelligence (AI) model combines imaging information with clinical patient data to improve diagnostic performance on chest X-rays

Representative radiographs (top), acquired in anteroposterior projection in the supine position, and corresponding attention maps (bottom). (A) Images show main diagnostic findings of the internal data set in a 49-year-old male patient with congestion, pneumonic infiltrates, and effusion (left); a 64-year-old male patient with congestion, pneumonic infiltrates, and effusion (middle); and a 69-year-old female patient with effusion (right). (B) Images show main diagnostic findings of the Medical Information Mart for Intensive Care data set in a 79-year-old male patient with cardiomegaly and pneumonic infiltrates in the right lower lung (left); a 58-year-old female patient with bilateral atelectasis and effusion in the lower lungs (middle); and a 48-year-old female patient with pneumonic infiltrates in the lower right lung (right). Note that the attention maps consistently focus on the most relevant image regions (eg, pneumonic opacities are indicated by opaque image regions of the lung). Image courtesy of RSNA 


October 4, 2023 — A new artificial intelligence (AI) model combines imaging information with clinical patient data to improve diagnostic performance on chest X-rays, according to a study published in Radiology, a journal of the Radiological Society of North America (RSNA). 

Clinicians consider both imaging and non-imaging data when diagnosing diseases. However, current AI-based approaches are tailored to solve tasks with only one type of data at a time. 

Transformer-based neural networks, a relatively new class of AI models, have the ability to combine imaging and non-imaging data for a more accurate diagnosis. These transformer models were initially developed for the computer processing of human language. They have since fueled large language models like ChatGPT and Google’s AI chat service, Bard. 

“Unlike convolutional neural networks, which are tuned to process imaging data, transformer models form a more general type of neural network,” said study lead author Firas Khader, M.Sc., a Ph.D. student in the Department of Diagnostic and Interventional Radiology at University Hospital Aachen in Aachen, Germany. “They rely on a so-called attention mechanism, which allows the neural network to learn about relationships in its input.” 
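
For readers unfamiliar with the mechanism Khader describes, the short sketch below shows scaled dot-product attention, the core operation that lets a transformer learn relationships among the elements of its input. It is an illustrative PyTorch snippet with arbitrary tensor sizes, not code from the study.

    # Minimal sketch of scaled dot-product attention (illustrative only).
    import torch
    import torch.nn.functional as F

    def scaled_dot_product_attention(query, key, value):
        """query, key, value: tensors of shape (batch, seq_len, dim)."""
        dim = query.size(-1)
        # Similarity of every query to every key, scaled for stability.
        scores = query @ key.transpose(-2, -1) / dim ** 0.5
        # Softmax turns the scores into attention weights that sum to 1.
        weights = F.softmax(scores, dim=-1)
        # Each output token is a weighted average of the value vectors.
        return weights @ value, weights

    # Toy example: one sequence of 4 tokens with 8-dimensional embeddings.
    x = torch.randn(1, 4, 8)
    out, attn = scaled_dot_product_attention(x, x, x)
    print(out.shape, attn.shape)  # (1, 4, 8) and (1, 4, 4)

Because the attention weights are computed over whatever tokens are supplied, the same operation can relate image regions to one another, clinical variables to one another, or image regions to clinical variables.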

This capability is ideal for medicine, where multiple variables like patient data and imaging findings are often integrated into the diagnosis. 

Khader and colleagues developed a transformer model tailored for medical use. They trained it on imaging and non-imaging patient data from two databases containing information from a combined total of more than 82,000 patients. 

The researchers trained the model to diagnose up to 25 conditions using non-imaging data, imaging data, or a combination of both, referred to as multimodal data. 
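
As a rough illustration of how such multimodal training can be set up, the following hypothetical PyTorch sketch projects image-patch features and clinical measurements into a shared token space, passes the combined sequence through a small transformer encoder, and predicts 25 findings. The module names, dimensions, and layer counts are assumptions made for illustration and do not reflect the architecture reported in the Radiology study.

    # Hypothetical multimodal fusion sketch (not the study's architecture).
    import torch
    import torch.nn as nn

    class MultimodalChestXrayClassifier(nn.Module):
        def __init__(self, n_patches=196, patch_dim=768, n_clinical=20,
                     embed_dim=256, n_labels=25):
            super().__init__()
            self.patch_proj = nn.Linear(patch_dim, embed_dim)    # image tokens
            self.clinical_proj = nn.Linear(1, embed_dim)         # one token per clinical value
            self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
            layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=8,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=4)
            self.head = nn.Linear(embed_dim, n_labels)           # 25 findings

        def forward(self, image_patches, clinical_values):
            # image_patches: (batch, n_patches, patch_dim) patch features
            # clinical_values: (batch, n_clinical) normalized lab values / vitals
            img_tokens = self.patch_proj(image_patches)
            clin_tokens = self.clinical_proj(clinical_values.unsqueeze(-1))
            cls = self.cls_token.expand(image_patches.size(0), -1, -1)
            tokens = torch.cat([cls, img_tokens, clin_tokens], dim=1)
            encoded = self.encoder(tokens)
            # Multi-label logits; apply sigmoid + BCE loss during training.
            return self.head(encoded[:, 0])

    model = MultimodalChestXrayClassifier()
    logits = model(torch.randn(2, 196, 768), torch.randn(2, 20))
    print(logits.shape)  # (2, 25)

Dropping either the image tokens or the clinical tokens from the input sequence yields the single-modality variants that a multimodal model can be compared against.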

Compared with its counterparts trained on imaging or non-imaging data alone, the multimodal model showed improved diagnostic performance for all conditions. 

The model has potential as an aid to clinicians in a time of growing workloads. 

“With patient data volumes increasing steadily over the years and time that the doctors can spend per patient being limited, it might become increasingly challenging for clinicians to interpret all available information effectively,” Khader said. “Multimodal models hold the promise to assist clinicians in their diagnosis by facilitating the aggregation of the available data into an accurate diagnosis.” 

The proposed model could serve as a blueprint for seamlessly integrating large data volumes, Khader said. 

For more information: www.rsna.org 

