News | Digital Pathology | January 10, 2019

AI Approach Outperformed Human Experts in Identifying Cervical Precancer

Artificial intelligence algorithm could revolutionize cervical cancer screening, especially in low-resource settings

January 10, 2019 — A research team led by investigators from the National Institutes of Health and Global Good has developed a computer algorithm that can analyze digital images of a woman’s cervix and accurately identify precancerous changes that require medical attention. This artificial intelligence (AI) approach, called automated visual evaluation, has the potential to revolutionize cervical cancer screening, particularly in low-resource settings.

To develop the method, researchers used comprehensive datasets to "train" a deep learning algorithm, a form of machine learning, to recognize patterns in complex visual inputs, such as medical images. The approach was created collaboratively by investigators at the National Cancer Institute (NCI) and Global Good, a fund at Intellectual Ventures, and the findings were confirmed independently by experts at the National Library of Medicine (NLM). The results appeared in the Journal of the National Cancer Institute on Jan. 10, 2019.1 NCI and NLM are parts of NIH.

"Our findings show that a deep learning algorithm can use images collected during routine cervical cancer screening to identify precancerous changes that, if left untreated, may develop into cancer," said Mark Schiffman, M.D., MPH, of NCI’s Division of Cancer Epidemiology and Genetics, and senior author of the study. "In fact, the computer analysis of the images was better at identifying precancer than a human expert reviewer of Pap tests under the microscope (cytology)."

The new method has the potential to be of particular value in low-resource settings. Healthcare workers in such settings currently use a screening method called visual inspection with acetic acid (VIA). In this approach, a health worker applies dilute acetic acid to the cervix and inspects the cervix with the naked eye, looking for "aceto whitening," which indicates possible disease. Because of its convenience and low cost, VIA is widely used where more advanced screening methods are not available. However, it is known to be inaccurate and needs improvement.

Automated visual evaluation is similarly easy to perform. Health workers can use a cell phone or similar camera device for cervical screening and treatment during a single visit. In addition, this approach can be performed with minimal training, making it ideal for countries with limited healthcare resources, where cervical cancer is a leading cause of illness and death among women.

To create the algorithm, the research team used more than 60,000 cervical images from an NCI archive of photos collected during a cervical cancer screening study that was carried out in Costa Rica in the 1990s. More than 9,400 women participated in that population study, with follow-up that lasted up to 18 years. Because of the prospective nature of the study, the researchers gained nearly complete information on which cervical changes became precancers and which did not. The photos were digitized and then used to train a deep learning algorithm so that it could distinguish cervical conditions requiring treatment from those not requiring treatment.
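
The study's actual model and training details are described in the JNCI paper. As a rough illustration of the general recipe outlined above, digitized cervix photos labeled by outcome used to train a deep learning classifier, the sketch below fine-tunes an ImageNet-pretrained network for a binary "needs treatment" decision. The folder layout, ResNet backbone and hyperparameters are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch (not the study's published pipeline): fine-tune an
# ImageNet-pretrained CNN to separate cervix images that need treatment
# from those that do not. Paths, folder layout and hyperparameters are
# illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard preprocessing for an ImageNet-pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: cervix_images/train/{needs_treatment,no_treatment}/*.jpg
train_set = datasets.ImageFolder("cervix_images/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True, num_workers=4)

# Replace the classifier head with a single logit for the binary decision.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 1)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

criterion = nn.BCEWithLogitsLoss()  # binary outcome: treat vs. do not treat
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        images = images.to(device)
        labels = labels.float().unsqueeze(1).to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")
```

Starting from a network pretrained on general images and fine-tuning it on a smaller, specialized archive is a common way to get reliable performance from tens of thousands of medical images rather than millions.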

"When this algorithm is combined with advances in HPV vaccination, emerging HPV detection technologies, and improvements in treatment, it is conceivable that cervical cancer could be brought under control, even in low-resource settings," said Maurizio Vecchione, executive vice president of Global Good.

The researchers plan to further train the algorithm on a sample of representative images of cervical precancers and normal cervical tissue from women in communities around the world, using a variety of cameras and other imaging options. This step is necessary because of subtle variations in the appearance of the cervix among women in different geographic regions. The ultimate goal of the project is to create the best possible algorithm for common, open use.

For more information: www.academic.oup.com/jnci

Reference

1. Hu L., Bell D., Antani S., et al. An Observational Study of Deep Learning and Automated Evaluation of Cervical Images for Cancer Screening. Journal of the National Cancer Institute, Jan. 10, 2019. https://doi.org/10.1093/jnci/djy225
