News | Ultrasound Imaging | June 12, 2018 | Tony Kontzer

NVIDIA GPU Cloud containers speed research into using affordable ultrasound technology more widely

How AI and Deep Learning Will Enable Cancer Diagnosis Via Ultrasound

The red outline shows the manually segmented boundary of a carcinoma, while the deep learning-predicted boundaries are shown in blue, green and cyan. Copyright 2018 Kumar et al. under Creative Commons Attribution License.
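Agreement between a manually drawn boundary and a network's prediction, as in the image above, is commonly scored with the Dice similarity coefficient. Below is a minimal NumPy sketch using small hypothetical binary masks (the shapes and values are illustrative, not from Kumar's data):

```python
import numpy as np

# Hypothetical 2D binary masks standing in for a manual segmentation
# and a network prediction (shapes and values are illustrative).
manual = np.zeros((8, 8), dtype=bool)
manual[2:6, 2:6] = True          # 16-pixel "lesion" drawn by a radiologist
predicted = np.zeros((8, 8), dtype=bool)
predicted[3:7, 3:7] = True       # overlapping but slightly shifted prediction

def dice(a, b):
    """Dice similarity coefficient: 2*|A intersect B| / (|A| + |B|)."""
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

print(dice(manual, predicted))   # 9 overlapping pixels -> 18/32 = 0.5625
```

A Dice score of 1.0 means perfect overlap with the manual boundary; 0.0 means none.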


June 12, 2018 — Viksit Kumar didn’t know his mother had ovarian cancer until it had reached its third stage, too late for chemotherapy to be effective. She died in a hospital in Mumbai, India, in 2006, but might have lived years longer if her cancer had been detected earlier. This knowledge ate at the mechanical engineering student, spurring him to choose a different path.

“That was one of the driving factors for me to move into the medical field,” said Kumar, now a senior research fellow at the Mayo Clinic, in Rochester, Minn. He hopes that the work his mom’s death inspired will help others to avoid her fate.

For the past few years, Kumar has been leading an effort to use GPU-powered deep learning to more accurately diagnose cancers sooner using ultrasound images. The work has focused on breast cancer (which is much more prevalent than ovarian cancer and attracts more funding), with the primary aim of enabling earlier diagnoses in developing countries, where mammograms are rare.

Into the Deep End of Deep Learning

Kumar came to this work soon after joining the Mayo Clinic. At the time, he was working with ultrasound images for diagnosing pre-term birth complications. When he noticed that ultrasounds were picking up different objects, he figured that they might be useful for classifying breast cancer images.

As he looked closer at the issue, he deduced that deep learning would be a good match. However, at the time, Kumar knew very little about deep learning. So he dove in, spending more than six months teaching himself everything he could about building and working with deep learning models.

“There was a drive behind that learning: This was a tool that could really help,” he said.

And help is needed. Breast cancer is one of the most common cancers, and one of the easiest to detect. However, in developing countries, mammogram machines are hard to find outside of large cities, primarily due to cost. As a result, healthcare providers often take a conservative approach and perform unnecessary biopsies.

Ultrasound offers a much more affordable option for far-flung facilities, which could lead to more women being referred for mammograms in large cities.

Even in developed countries, where most women have regular mammograms after the age of 40, Kumar said ultrasound could prove critical for diagnosing women who are pregnant or are planning to get pregnant, and who can’t be exposed to a mammogram’s X-rays.

Getting Better All the Time

Kumar is amazed at how far the deep learning tools have already progressed. It used to take two or three days for him to configure a system for deep learning, and now takes as little as a couple of hours.

Kumar’s team does its local processing using the TensorFlow deep learning framework container from NVIDIA GPU Cloud (NGC) on NVIDIA TITAN and GeForce GPUs. For the heaviest lifting, the work shifts to NVIDIA Tesla V100 GPUs on Amazon Web Services, using the same container from NGC.

The NGC containers are optimized to deliver maximum performance on NVIDIA Volta and Pascal architecture GPUs on-premises and in the cloud, and include everything needed to run GPU-accelerated software. Using the same container in both environments lets the team run jobs wherever it has compute resources.
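A workflow like the one described typically amounts to pulling one NGC image and running it both locally and in the cloud. The sketch below uses the 2018-era nvidia-docker runtime and an illustrative image tag; the exact tag and mounted paths are assumptions, not details from the article:

```shell
# Log in to the NGC registry (API key generated at ngc.nvidia.com)
docker login nvcr.io

# Pull the TensorFlow container (tag shown is illustrative)
docker pull nvcr.io/nvidia/tensorflow:18.06-py3

# Run with GPU access, mounting the current directory as the workspace;
# the same command works on a local TITAN box or an AWS V100 instance
nvidia-docker run -it --rm -v $PWD:/workspace \
    nvcr.io/nvidia/tensorflow:18.06-py3
```

Because the image pins the framework, CUDA libraries, and drivers together, the local and cloud runs see an identical software stack.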

“Once we have the architecture developed and we want to iterate on the process, then we go to AWS [Amazon Web Services],” said Kumar, estimating that doing so is at least eight times faster than processing larger jobs locally, thanks to the greater number of more advanced GPUs in play.

The team currently does both training and inference on the same GPUs. Kumar said he eventually wants to run inference directly on an ultrasound machine in live mode.

More Progress Coming

Kumar hopes to start applying the technique on live patient trials within the next year.

Eventually, he hopes his team’s work enables ultrasound images to be used in early detection of other cancers, such as thyroid and, naturally, ovarian cancer.

Kumar urges patience when it comes to applying AI and deep learning in the medical field. “It needs to be a mature technology before it can be accepted as a clinical standard by radiologists and sonographers,” he said.

Read Kumar’s paper, “Automated and real-time segmentation of suspicious breast masses using convolutional neural network.”

For more information: www.nvidia.com

This piece originally appeared as a blog post on NVIDIA’s website.
