Greg Freiherr, Industry Consultant

Greg Freiherr has reported on developments in radiology since 1983. He runs the consulting service, The Freiherr Group.

Blog | Greg Freiherr, Industry Consultant | Artificial Intelligence | October 04, 2017

Don’t Expect Too Much From Artificial Intelligence

In the 1980s I was enthralled with hybridomas. Created by fusing human tumor cells with cells from mouse spleens that had been sensitized to the patient’s own cancer, they were supposed to pump out the magic bullets that finally would take cancer down. At the time, I believed it.

Thirty years later, the jury is still out.

Immune technology is taking some impressive steps. But it has not lived up to early expectations.

It’s a lesson worth remembering when considering artificial intelligence (AI). Yes, machines are getting more efficient. And yes, algorithms are shouldering some of the more menial tasks. But so far they haven’t come close to fomenting revolutions. There is even serious doubt as to how “smart” they really are.

This question goes to the root of our definition of smart, which has gotten pretty loose lately. (If you refer to your cell as a smartphone, you know what I mean.)

Maybe that’s a good thing when it comes to AI. It sets the bar at a level that machines can get over.

 

Watson or Holmes?

The story goes that Sherlock Holmes and Dr. Watson were camping when, suddenly, both men awoke.

“Look up and tell me what you see,” Holmes said.

“I see the starry heavens above,” Watson replied.

“And what do you deduce from that?”

“That we are small and insignificant beings in a vast cosmos.”

“No!” Holmes shouted, exasperated. “Someone has stolen our tent!”

Will smart machines play the role of Watson or Holmes? Will they wax poetic on the grand issues? Or draw conclusions that are immediately applicable to the current situation?

I’m excited by the possibility that smart algorithms might take radiology beyond the qualitative — adding data analytics to the interpretive process. Or that they might meld traditionally nonradiologic information, such as genetic or genomic data, into the interpretation. But I realize this might not be possible.

Past experience with hyped technologies — especially medical ones — has jaded me.

 

Shortfalls

Despite its public triumph in 2011 as a contestant on the TV game show “Jeopardy,” IBM’s AI system Watson has fueled that skepticism. This past spring, venture capitalist Chamath Palihapitiya went so far as to call Watson a “joke.” (“IBM’s Watson ‘is a joke,’ says Social Capital CEO Palihapitiya,” CNBC, May 9, 2017.)

That may be too harsh.

There is no denying that Watson has struggled. One of its most widely reported failures involved MD Anderson Cancer Center, which late last year pulled out of its deal to help develop the Watson-powered Oncology Expert Advisor. Four-year costs that exceeded $60 million and less-than-spectacular performance were two of the reasons cited. (“Big Data Bust: MD Anderson-Watson Project Dies,” Medscape, Feb. 22, 2017.)

An IBM spokesperson defended the project, however, as a success, saying that Oncology Expert Advisor “likely could have been deployed had MD Anderson chosen to take it forward.” (“MD Anderson Benches IBM Watson In Setback For Artificial Intelligence In Medicine,” Forbes, Feb. 19, 2017.)

It’s important to note that the Oncology Expert Advisor is not the same product as IBM Watson for Oncology, which IBM has placed in clinical use in at least one hospital. According to the company, Watson for Oncology, which was “trained” by Memorial Sloan Kettering, will be used to help personalize treatment for breast, lung, colorectal, gastric, cervical and ovarian cancers. (“Jupiter Medical Center Implements Revolutionary Watson for Oncology to Help Oncologists Make Data-Driven Cancer Treatment Decisions,” IBM press release, Feb. 1, 2017.)

 

Getting Real

When it comes to radiology, the use of smart machines to analyze images is obvious. But AI’s biggest impact might come from algorithms that improve human performance.

AI algorithms are already being groomed to perform mundane tasks otherwise done by technologists, for example, automating scan techniques so staff can stay on schedule. This prevents waiting room delays that can infuriate patients, while keeping throughput high.

Further exemplifying improved efficiency are “no-wait” AI localizers that recognize anatomical landmarks in scout images. With these, an MR scanner can automatically position data capture, set the size of the field of view (e.g., Hitachi) and even optimize scan times (e.g., Siemens’ GoBrain, Philips’ five-minute Dixon abdominal exams). These algorithms are being written not only for new machines but also as upgrades for older generations.
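
To make the idea concrete, here is a minimal sketch of what such a localizer does conceptually: find a landmark in a scout image and center the field of view on it. The thresholded-centroid “detector” is a toy stand-in for a trained network, and every name here is illustrative — vendor implementations are proprietary and certainly differ.

```python
# Toy sketch of a "no-wait" localizer: detect a landmark in a scout image,
# then center the field of view (FOV) on it. The thresholded centroid below
# is a stand-in for a trained detection model, not any vendor's algorithm.
import numpy as np

def detect_landmark(scout: np.ndarray) -> tuple[int, int]:
    """Return (row, col) of the brightest-tissue centroid."""
    mask = scout > scout.mean() + scout.std()
    rows, cols = np.nonzero(mask)
    return int(rows.mean()), int(cols.mean())

def fov_from_landmark(scout: np.ndarray, half: int = 64) -> tuple[slice, slice]:
    """Center a square FOV on the landmark, clipped to the image bounds."""
    cy, cx = detect_landmark(scout)
    return (slice(max(cy - half, 0), min(cy + half, scout.shape[0])),
            slice(max(cx - half, 0), min(cx + half, scout.shape[1])))

# Usage: a synthetic scout with one bright "anatomy" blob.
scout = np.zeros((256, 256))
scout[100:140, 110:160] = 1.0
rows, cols = fov_from_landmark(scout)
print(f"FOV rows {rows.start}-{rows.stop}, cols {cols.start}-{cols.stop}")
```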

The potential also exists for AI algorithms that streamline worklists to reduce the time between exam and interpretation. Routing specific cases to the radiologists best suited to read them could substantially improve efficiency.
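
A minimal sketch of that routing idea follows, assuming an upstream triage model has already tagged each exam with a subspecialty and an urgency score. The field names, scoring rule and roster are made up for illustration, not any vendor’s actual API.

```python
# Toy sketch of AI-assisted worklist routing: queue each exam for the
# matching subspecialist, most urgent first, with waiting time as the
# tiebreaker so no study languishes. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Exam:
    study_id: str
    subspecialty: str    # e.g., "neuro", "chest"
    urgency: float       # 0..1, from a hypothetical triage model
    minutes_waiting: int

ROSTER = {"neuro": "Dr. A", "chest": "Dr. B"}  # toy subspecialty roster

def route(exams: list[Exam]) -> dict[str, list[Exam]]:
    """Build a per-reader worklist, ordered by urgency then waiting time."""
    worklists: dict[str, list[Exam]] = {}
    for exam in sorted(exams, key=lambda e: (-e.urgency, -e.minutes_waiting)):
        reader = ROSTER.get(exam.subspecialty, "general pool")
        worklists.setdefault(reader, []).append(exam)
    return worklists

# Usage: two equally urgent chest cases; the longer-waiting one goes first.
exams = [Exam("S1", "chest", 0.9, 5),
         Exam("S2", "neuro", 0.4, 40),
         Exam("S3", "chest", 0.9, 30)]
for reader, queue in route(exams).items():
    print(reader, [e.study_id for e in queue])
```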

The bottom line: Delegating simple tasks to machines makes sense. (See “Must Radiologists Be Prepared To Delegate ... To Smart Machines?” at http://bit.ly/2jlok2s.)

It is at this nexus of people and machines that the greatest gains are to be made.

I’m hoping that, as AI evolves in the coming few years, developers will resist the temptation to aim at — and hype — the grandiose, choosing instead to focus on practical issues.

That would be really smart.
