Feature | Society of Breast Imaging (SBI) | September 06, 2019 | By Greg Freiherr

AI Algorithm Detects Breast Cancer in MR Images

Presentation at breast imaging symposium demonstrates potential of deep learning

A smart algorithm has been trained on a neural network to recognize the appearance of breast cancer in MR images. The algorithm, described at the SBI/ACR Breast Imaging Symposium, used deep learning, a form of machine learning, which is a type of artificial intelligence. Image courtesy of Sarah Eskreis-Winkler, M.D.

The use of smart algorithms has the potential to make healthcare more efficient. Sarah Eskreis-Winkler, M.D., presented data showing that such an algorithm — trained using deep learning (DL), a type of artificial intelligence (AI) — can reliably identify breast tumors in magnetic resonance (MR) images, and in doing so could make radiology more efficient.

On April 4, at the Society of Breast Imaging (SBI)/American College of Radiology (ACR) Breast Imaging Symposium, Eskreis-Winkler stated that the algorithm, which was trained to identify tumors in breast MR images, could save time without compromising accuracy. Deep learning, she explained in her talk, is a subset of machine learning, which is part of artificial intelligence.

“Deep learning is a new powerful technology that has the potential to help us with a wide range of imaging tasks,” said Eskreis-Winkler, a radiology resident at Weill Cornell Medicine/New York-Presbyterian Hospital. In her talk at the SBI symposium, she said DL has been “shown to meet and in some cases exceed human-level performance.”

How The DL Algorithm Was Developed

Eskreis-Winkler and her colleagues used a neural network to classify segments of the MR image and to extract features. The algorithm learned to do this on its own. The use of DL eliminated the need to explicitly tell the computer exactly what to look for, she said during the presentation: “We just feed the entire image into the neural network, and the computer figures out which parts are important all by itself.”
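To make that idea concrete, here is a minimal sketch of what a slice-level tumor/no-tumor classifier of this kind might look like, assuming a PyTorch-style convolutional network. The architecture, layer sizes and input dimensions are illustrative assumptions, not details taken from the study; the point is simply that the whole image goes in and the convolutional layers learn their own features.

# Minimal sketch of a slice-level tumor/no-tumor classifier, assuming PyTorch.
# The layer sizes and architecture are illustrative only, not from the study.
import torch.nn as nn

class SliceClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional layers learn their own image features --
        # nothing is hand-engineered or explicitly specified.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        # Single output logit: tumor present vs. absent.
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):                  # x: (batch, 1, H, W) MR slice
        h = self.features(x).flatten(1)
        return self.classifier(h)          # raw logit; apply sigmoid for a probability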

Eskreis-Winkler, who is working toward a doctorate in MRI physics “interspersed with the residency,” outlined the development of a deep learning tool for clinical use. Initially, many batches of labeled images are fed into the neural network. When training begins, the network weights, which are used to make decisions, are randomly initialized. “So network accuracy is about as good as a coin toss,” she said.

The network, however, learns from its mistakes using a process called backpropagation, whereby wrongly categorized image results are fed backwards through the network and the decision weights are adjusted. “So the next time the network is fed a similar case, it has learned from its mistake and it gets the answer right,” said Eskreis-Winkler, who plans to be a breast imaging fellow at Memorial Sloan Kettering Cancer Center (MSKCC) after completing her Ph.D. and residency in June 2019. Work on the project was done at MSKCC, she said, with Harini Veeraraghavan, Natsuko Onishi, Shreena Shah, Meredith Sadinski, Danny Martinez, Yi Wang, Elizabeth Morris and Elizabeth Sutton.
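The training process she described — random starting weights, then repeated correction through backpropagation — can be sketched in a few lines. This is a generic illustration assuming PyTorch and the hypothetical SliceClassifier above; the data loader, loss function and hyperparameters are placeholders rather than the study's actual settings.

# Minimal sketch of the training loop described above (PyTorch assumed).
import torch
import torch.nn as nn

def train(model, loader, epochs=10, lr=1e-4):
    # Weights start out randomly initialized, so initial accuracy is near chance.
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for epoch in range(epochs):
        for slices, labels in loader:        # labels: 1 = tumor, 0 = no tumor
            logits = model(slices).squeeze(1)
            loss = loss_fn(logits, labels.float())
            optimizer.zero_grad()
            loss.backward()                  # backpropagation: errors flow backwards
            optimizer.step()                 # decision weights are adjusted
    return model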

After her symposium talk, Eskreis-Winkler told Imaging Technology News that, if integrated into the clinical workflow, the algorithm has the potential to improve the efficiency of the radiologist, “so that the tumor pops up when you open a case on PACS.” Its use might also save time during tumor boards, she said, by automatically scrolling to breast MRI slices that show cancer lesions. This would eliminate the time otherwise spent manually scrolling to these slices.
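One hypothetical way such a workflow feature could work is to score every slice in a series and hand a viewer the indices worth jumping to. The function name and threshold below are assumptions for illustration only; they do not describe any existing PACS integration or the group's software.

# Hypothetical illustration of the workflow idea: score each slice in a breast
# MRI series and return the indices a viewer could jump to automatically.
import torch

def flag_tumor_slices(model, volume, threshold=0.5):
    """volume: (num_slices, 1, H, W) tensor of MR slices from one series."""
    model.eval()
    with torch.no_grad():
        probs = torch.sigmoid(model(volume).squeeze(1))
    # Indices of slices whose predicted tumor probability exceeds the threshold.
    return [i for i, p in enumerate(probs) if p > threshold]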

DL Algorithm Scores in the ‘90s

The algorithm that she described at the SBI symposium processed MR images from 277 women, classifying segments within these images as either showing or not showing tumor. The algorithm achieved an accuracy of 93 percent on a test set. Sensitivity and specificity for tumor detection were 94 percent and 92 percent, respectively.
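For readers unfamiliar with how those figures relate to each other, the sketch below shows how accuracy, sensitivity and specificity are computed from slice-level labels and predictions. It is a generic illustration; the variable names and any numbers used with it are not the study's data.

# Sketch of how the reported metrics are derived from test-set predictions.
def accuracy_sensitivity_specificity(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # fraction of tumor slices correctly flagged
    specificity = tn / (tn + fp)   # fraction of tumor-free slices correctly cleared
    return accuracy, sensitivity, specificity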

She described the results as “promising, because the dataset size we were using — about 6,000 slices — wasn’t even so big by deep learning standards. Going forward we should be able to improve our results by increasing the size of our dataset.”

DL works best when using at least 20,000 slices, Eskreis-Winkler said.

Deep learning will not provide the whole solution, she cautioned. Radiologists will have to work with DL algorithms for the technology to achieve its potential.

“The way in which AI tools will be integrated into our daily practice is still uncertain,” she said in her SBI presentation. “So there is a big opportunity for us to be creative and to be proactive, to come up with ways to harness the power of AI to make us better radiologists and to better serve our patients.”

Machines make diagnostic errors, as do radiologists, Eskreis-Winkler asserted. “But they don’t make the same kinds of errors,” she told ITN. “So one of the really exciting areas is to figure out how to best combine the power of humans and machines, to push our diagnostic performance to new heights. This is an initial step in that direction.” 

Greg Freiherr is a contributing editor to Imaging Technology News (ITN). Over the past three decades, he has served as business and technology editor for publications in medical imaging, as well as consulted for vendors, professional organizations, academia and financial institutions.

Related content:

Is Artificial Intelligence The Doom of Radiology?

FDA Proposes New Review Framework for AI-based Medical Devices

Video: Technology Report: Artificial Intelligence
