News | Artificial Intelligence | August 13, 2019

Artificial Intelligence Could Yield More Accurate Breast Cancer Diagnoses

System developed at UCLA can interpret images that are challenging for doctors to classify

August 13, 2019 — University of California Los Angeles (UCLA) researchers have developed an artificial intelligence (AI) system that could help pathologists read biopsies more accurately, and better detect and diagnose breast cancer.

The new system, described in a study published in JAMA Network Open, helps interpret medical images used to diagnose breast cancer that can be difficult for the human eye to classify, and it does so about as accurately as, or better than, experienced pathologists.1

“It is critical to get a correct diagnosis from the beginning so that we can guide patients to the most effective treatments,” said Joann Elmore, M.D., MPH, the study’s senior author and a professor of medicine at the David Geffen School of Medicine at UCLA.

A 2015 study led by Elmore found that pathologists often disagree on the interpretation of breast biopsies, which are performed on millions of women each year.2 That earlier research revealed that diagnostic errors occurred in about one out of every six women who had ductal carcinoma in situ (a noninvasive type of breast cancer), and that incorrect diagnoses were given in about half of the biopsy cases of breast atypia (abnormal cells that are associated with a higher risk for breast cancer).

“Medical images of breast biopsies contain a great deal of complex data and interpreting them can be very subjective,” said Elmore, who is also a researcher at the UCLA Jonsson Comprehensive Cancer Center. “Distinguishing breast atypia from ductal carcinoma in situ is important clinically but very challenging for pathologists. Sometimes, doctors do not even agree with their previous diagnosis when they are shown the same case a year later.”

The scientists reasoned that artificial intelligence could provide more consistently accurate readings because, by drawing on a large data set, the system can recognize patterns in the samples that are associated with cancer but difficult for humans to see.

The team fed 240 breast biopsy images into a computer, training it to recognize patterns associated with several types of breast lesions, ranging from benign (noncancerous) and atypia to ductal carcinoma in situ (DCIS) and invasive breast cancer. Separately, the correct diagnoses for each image were determined by a consensus among three expert pathologists.
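The study describes its own machine learning pipeline in detail; purely as a hedged illustration of what a four-class training-and-evaluation setup of this general kind can look like (not the authors' actual method), the Python sketch below uses scikit-learn with randomly generated placeholder features and consensus-style labels. Every feature, label and parameter here is an assumption for demonstration only.

# Illustrative sketch only -- not the UCLA group's pipeline.
# Assumes each biopsy image has already been reduced to a feature vector;
# the features and labels below are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

CLASSES = ["benign", "atypia", "DCIS", "invasive"]

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 64))           # 240 images, 64 hypothetical features each
y = rng.integers(0, len(CLASSES), 240)   # placeholder consensus labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))

With real data, the placeholder feature matrix would be replaced by measurements extracted from the biopsy images, and the labels by the expert consensus diagnoses.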

To test the system, the researchers compared its readings to independent diagnoses made by 87 practicing U.S. pathologists. The artificial intelligence program came close to matching the human doctors in differentiating cancer from non-cancer cases, and it outperformed them when differentiating DCIS from atypia, considered the greatest challenge in breast cancer diagnosis. The system correctly determined whether images showed DCIS or atypia more often than the doctors; it had a sensitivity between 0.88 and 0.89, while the pathologists' average sensitivity was 0.70. (Sensitivity is the proportion of truly positive cases a reader correctly identifies, so a higher score means fewer missed diagnoses.)
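Sensitivity is the true positive rate, the number of true positives divided by the sum of true positives and false negatives. As a small worked example with made-up counts (not figures reported in the study), the snippet below shows how scores of roughly 0.70 and 0.88 correspond to the fraction of genuinely positive cases a reader catches.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """True positive rate: share of actual positives correctly identified."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical counts: out of 100 cases that truly show DCIS...
print(sensitivity(70, 30))   # 0.70 -- comparable to the pathologists' average
print(sensitivity(88, 12))   # 0.88 -- comparable to the AI system's reported range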

“These results are very encouraging,” Elmore said. “There is low accuracy among practicing pathologists in the U.S. when it comes to the diagnosis of atypia and ductal carcinoma in situ, and the computer-based automated approach shows great promise.”

The researchers are now working on training the system to diagnose melanoma.

For more information: www.jamanetwork.com/journals/jamanetworkopen


References

1. Mercan E., Mehta S., Bartlett J., et al. Assessment of Machine Learning of Breast Pathology Structures for Automated Differentiation of Breast Cancer and High-Risk Proliferative Lesions. JAMA Network Open, Aug. 9, 2019. doi:10.1001/jamanetworkopen.2019.8777

2. Elmore J.G., Longton G.M., Carney P.A., et al. Diagnostic Concordance Among Pathologists Interpreting Breast Biopsy Specimens. JAMA, March 17, 2015. doi:10.1001/jama.2015.1405
