News | Artificial Intelligence | July 29, 2019

New AI Tool Identifies Cancer Outcomes Using Radiology Reports

Artificial intelligence tool developed at Dana-Farber Cancer Institute uses natural language processing to rapidly assess unstructured data


July 29, 2019 — Scientists at Dana-Farber Cancer Institute have demonstrated that an artificial intelligence (AI) tool can extract clinical information about changes in tumors from unstructured radiology reports for patients with lung cancer as accurately as human reviewers, and much more rapidly.

The AI tool performed comparably to trained human “curators” in detecting the presence of cancer and in determining whether it was responding to treatment, stable, or worsening.

The goal of the study, said corresponding author Kenneth Kehl, M.D., MPH, a medical oncologist and faculty member of the Division of Population Sciences at Dana-Farber, was to determine whether AI tools can extract high-value cancer outcomes from radiology reports, a ubiquitous but unstructured data source.

Kehl noted that electronic health records (EHRs) now collect vast amounts of information on thousands of patients seen at a center like Dana-Farber. However, unless patients are enrolled in clinical trials, information about their outcomes, such as whether their cancers grow or shrink in response to treatment, is recorded only in the free text of the medical record. Historically, such unstructured information has not been amenable to computational analysis and therefore could not be used for research into the effectiveness of treatment.

Because of studies like the Profile initiative at Dana-Farber/Brigham and Women’s Cancer Center, which analyzes patient tumor samples and creates profiles that reveal genomic variants that may predict responsiveness to treatments, Dana-Farber researchers have accumulated a wealth of molecular information about patients’ cancers. “But it can be difficult to apply this information to understand what molecular patterns predict benefit from treatments without intensive review of patients’ medical records to measure their outcomes. This is a critical barrier to realizing the full potential of precision medicine,” said Kehl.

For the current study, Kehl and colleagues obtained more than 14,000 imaging reports for 1,112 patients and manually reviewed the records using the “PRISSMM” framework. PRISSMM is a phenomic data standard developed at Dana-Farber that takes unstructured data from text reports in EHRs and structures it so that it can be readily analyzed. PRISSMM structures data pertaining to a patient’s pathology, radiology/imaging, signs/symptoms, molecular markers and a medical oncologist’s assessment to create a portrait of the cancer patient journey.
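The imaging arm of that framework suggests a simple record structure. As a purely illustrative sketch in Python (the class and field names below are hypothetical, not the published PRISSMM standard), each curated report might reduce to something like:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ImagingAnnotation:
    """A structured, PRISSMM-style annotation of one imaging text report.

    Field names are illustrative guesses only, not the published standard.
    """
    patient_id: str
    report_date: date
    cancer_present: bool                    # any evidence of cancer on this scan
    status: Optional[str] = None            # e.g. "improving", "stable", "worsening"
    sites_of_spread: list = field(default_factory=list)  # e.g. ["liver", "bone"]
```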

Human reviewers analyzed the imaging text reports and noted whether cancer was present and, if so, whether it was worsening or improving, and if the cancer had spread to specific body sites. These reports were then used to train a computational deep learning model to recognize these outcomes from the text reports. “Our hypothesis was that deep learning algorithms could use routinely generated radiology text reports to identify the presence of cancer and changes in its extent over time,” the authors wrote.
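The study trained deep natural language processing models on the curators' labels. As a deliberately simplified stand-in for that approach (this toy pipeline uses TF-IDF features and logistic regression rather than the authors' deep learning architecture, and the example reports are invented), the general train-then-annotate pattern looks like:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for curator-labeled radiology reports (the real training set
# contained thousands of reports).
reports = [
    "No evidence of recurrent or metastatic disease.",
    "Interval growth of the right lower lobe mass, worrisome for progression.",
    "Stable appearance of the known pulmonary nodules.",
    "Decrease in size of the dominant lung lesion, consistent with response.",
]
labels = ["no cancer", "worsening", "stable", "improving"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # word and bigram features from report text
    LogisticRegression(max_iter=1000),    # simple linear classifier over those features
)
model.fit(reports, labels)

# Once trained, the model can label unreviewed reports in bulk.
print(model.predict(["Enlarging hepatic metastases compared with prior exam."]))
```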

The researchers compared human and computer measurements of outcomes such as disease-free survival, progression-free survival, and time to improvement or response, and found that the AI algorithm could replicate human assessment of these outcomes. The deep learning algorithms were then applied to annotate another 15,000 reports for 1,294 patients whose records had not been manually reviewed. The authors found that computer outcome measurements among these patients predicted survival with similar accuracy to human assessments among the manually reviewed patients.
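Given per-report labels like these, time-to-event outcomes can be read off a patient's date-ordered annotations. A minimal sketch follows; the function and its rule are our assumptions for illustration, not the study's exact outcome definitions:

```python
from datetime import date
from typing import Optional

def time_to_progression(annotations: list[tuple[date, str]],
                        start: date) -> Optional[int]:
    """Days from treatment start to the first report labeled 'worsening'.

    `annotations` is a list of (report_date, status) pairs, where status
    comes from either the human curator or the model. Returns None if no
    progression has been observed (the patient would be censored).
    """
    for report_date, status in sorted(annotations):
        if report_date >= start and status == "worsening":
            return (report_date - start).days
    return None

# Example: progression first noted about four months into treatment.
scans = [(date(2018, 3, 1), "stable"), (date(2018, 7, 2), "worsening")]
print(time_to_progression(scans, start=date(2018, 3, 1)))  # 123
```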

The human curators were able to annotate imaging reports for about three patients per hour, a rate at which one curator would need about six months to annotate all of the nearly 30,000 imaging reports for the patients in the cohort. By contrast, the artificial intelligence model that the researchers developed could annotate the imaging reports for the cohort in about 10 minutes, the researchers said in a report in JAMA Oncology.
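Those figures are consistent with back-of-the-envelope arithmetic (the 40-hour work week below is our assumption):

```python
patients = 1112 + 1294           # manually reviewed cohort + model-annotated cohort
curator_hours = patients / 3     # about three patients annotated per hour
work_weeks = curator_hours / 40  # assuming a 40-hour work week
print(f"{curator_hours:.0f} hours, about {work_weeks:.0f} full-time weeks")
# -> 802 hours, about 20 full-time weeks: on the order of the six months cited
```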

“To create a true learning health system for oncology and to facilitate delivery of precision medicine at scale, methods are needed to accelerate curation of cancer-related outcomes from electronic health records,” said the authors of the publication. If applied widely, the investigators said, “this technique could substantially accelerate efforts to use real-world data from all patients with cancer to generate evidence regarding effectiveness of treatment approaches.” Next steps will include testing this approach on EHR data from other cancer centers and using the data to discover which treatments work best for which patients.

The senior author of the study is Deborah Schrag, M.D., MPH, chief of the Division of Population Sciences at Dana-Farber and a medical oncologist.

For more information: www.jamanetwork.com/journals/jamaoncology


Reference

1. Kehl K.L., Elmarakeby H., Nishino M., et al. Assessment of Deep Natural Language Processing in Ascertaining Oncologic Outcomes From Radiology Reports. JAMA Oncology, published online July 25, 2019. doi:10.1001/jamaoncol.2019.1800
