News | Artificial Intelligence | December 27, 2019

Artificial Intelligence Identifies Previously Unknown Features Associated with Cancer Recurrence

The AI also identified features relevant to cancer prognosis that were not previously noted by pathologists, leading to higher accuracy in predicting prostate cancer recurrence than pathologist-based diagnosis

The AIP's RAIDEN AI supercomputer

December 27, 2019 — Artificial intelligence (AI) technology developed by the RIKEN Center for Advanced Intelligence Project (AIP) in Japan has successfully found features in pathology images from human cancer patients, without annotation, that could be understood by human doctors. Further, the AI identified features relevant to cancer prognosis that had not previously been noted by pathologists, leading to higher accuracy in predicting prostate cancer recurrence than pathologist-based diagnosis. Combining the predictions made by the AI with predictions by human pathologists led to even greater accuracy.

According to Yoichiro Yamamoto, M.D., Ph.D., the first author of the study published in Nature Communications, "This technology could contribute to personalized medicine by making highly accurate prediction of cancer recurrence possible by acquiring new knowledge from images. It could also contribute to understanding how AI can be used safely in medicine by helping to resolve the issue of AI being seen as a 'black box.'"

The research group led by Yamamoto and Go Kimura, in collaboration with a number of university hospitals in Japan, adopted an approach called "unsupervised learning." As long as humans teach the AI, it cannot acquire knowledge beyond what is already known. Rather than being "taught" medical knowledge, the AI was asked to learn using unsupervised deep neural networks, known as autoencoders, without being given any medical knowledge. The researchers then developed a method for translating the features found by the AI — only numbers initially — into high-resolution images that humans can understand.
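For readers curious how this kind of label-free feature learning works in practice, the sketch below shows a toy convolutional autoencoder that compresses pathology image patches into a short vector of numbers and reconstructs the patch from that vector. The architecture, patch size and training details are illustrative assumptions and are not the network published by the RIKEN group.

```python
# A minimal sketch of unsupervised feature learning with a convolutional
# autoencoder. The 64x64 patch size, layer sizes and hyperparameters are
# assumptions for illustration, not the study's actual model.
import torch
import torch.nn as nn

class PatchAutoencoder(nn.Module):
    def __init__(self, feature_dim: int = 64):
        super().__init__()
        # Encoder: compress a 3x64x64 RGB patch into `feature_dim` numbers.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, feature_dim),
        )
        # Decoder: reconstruct the patch from the compressed features.
        self.decoder = nn.Sequential(
            nn.Linear(feature_dim, 32 * 16 * 16),
            nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)              # the learned features: just numbers
        return self.decoder(z), z

# Training uses only the images themselves, with no diagnostic labels.
model = PatchAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

patches = torch.rand(8, 3, 64, 64)       # stand-in for a batch of image patches
reconstruction, features = model(patches)
loss = loss_fn(reconstruction, patches)  # reconstruction error drives learning
loss.backward()
optimizer.step()
```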

To perform this feat, the group acquired 13,188 whole-mount pathology slide images of the prostate from Nippon Medical School Hospital (NMSH). The amount of data was enormous, equivalent to approximately 86 billion image patches (sub-images divided up for the deep neural networks), and the computation was performed on AIP's powerful RAIDEN supercomputer.
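As a rough illustration of how a whole-mount slide is divided into the small sub-images a neural network consumes, the sketch below tiles a large image into fixed-size patches. The patch size, stride and file handling are assumptions made for the example, not the group's actual pipeline.

```python
# A minimal sketch of dividing one slide image into patches.
from PIL import Image

def extract_patches(slide_path: str, patch_size: int = 64, stride: int = 64):
    """Yield (x, y, patch) tiles from one slide image."""
    Image.MAX_IMAGE_PIXELS = None          # whole-mount slides are very large
    slide = Image.open(slide_path).convert("RGB")
    width, height = slide.size
    for y in range(0, height - patch_size + 1, stride):
        for x in range(0, width - patch_size + 1, stride):
            yield x, y, slide.crop((x, y, x + patch_size, y + patch_size))

# Example usage (hypothetical file name):
# n_patches = sum(1 for _ in extract_patches("slide_0001.tif"))
```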

The AI learned from 11 million image patches of pathology images without diagnostic annotation. The features found by the AI included cancer diagnostic criteria that have been used worldwide, such as the Gleason score, but also features involving the stroma — the connective tissue supporting an organ — in non-cancer areas that experts were not aware of. To evaluate these AI-found features, the research group verified the performance of recurrence prediction using the remaining cases from NMSH (internal validation). The group found that the features discovered by the AI were more accurate (AUC=0.820) than predictions based on the human-established cancer criteria developed by pathologists, the Gleason score (AUC=0.744). Furthermore, combining the AI-found features with the human-established criteria predicted recurrence more accurately than either method alone (AUC=0.842). The group confirmed the results using another dataset of 2,276 whole-mount pathology images (10 billion image patches) from St. Marianna University Hospital and Aichi Medical University Hospital (external validation).
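The comparison reported here comes down to fitting a recurrence classifier on each feature set and scoring it by area under the ROC curve (AUC). The sketch below shows that kind of comparison on synthetic stand-in data; the variable names and models are hypothetical, and the numbers it prints will not reproduce the study's AUC values.

```python
# Illustrative AUC comparison on synthetic data: AI-derived features alone,
# Gleason score alone, and the two combined. Not the study's actual analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
ai_features = rng.normal(size=(n, 64))       # stand-in for autoencoder features
gleason = rng.integers(6, 11, size=(n, 1))   # stand-in for Gleason scores
recurrence = rng.integers(0, 2, size=n)      # stand-in for recurrence labels

def auc_for(features):
    X_train, X_test, y_train, y_test = train_test_split(
        features, recurrence, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])

print("AI features only  :", auc_for(ai_features))
print("Gleason score only:", auc_for(gleason))
print("Combined          :", auc_for(np.hstack([ai_features, gleason])))
```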

"I was very happy," said Yamamoto, "to discover that the AI was able to identify cancer on its own from unannotated pathology images. I was extremely surprised to see that AI found features that can be used to predict recurrence that pathologists had not identified."

He continued, "We have shown that AI can automatically acquire human-understandable knowledge from diagnostic annotation-free histopathology images. This 'newborn' knowledge could be useful for patients by allowing highly-accurate predictions of cancer recurrence. What is very nice is that we found that combining the AI's predictions with those of a pathologist increased the accuracy even further, showing that AI can be used hand-in-hand with doctors to improve medical care. In addition, the AI can be used as a tool to discover characteristics of diseases that have not been noted so far, and since it does not require human knowledge, it could be used in other fields outside medicine."

For more information: www.riken.jp/en/research/labs/aip/
