Greg Freiherr, Industry Consultant

Greg Freiherr has reported on developments in radiology since 1983. He runs the consulting service, The Freiherr Group.

Blog | Greg Freiherr, Industry Consultant | Radiology Imaging | November 17, 2016

Data Management Will Change: The "How" Could Surprise You

Machine learning image courtesy of Pixabay

Data are streaming faster and in greater quantities than ever before. These data may be images coming from imaging machines, ECGs, digital slides from pathology, or vital signs captured by wrist-worn fitness devices.

How great would it be to process these data efficiently so we can use the analyses to help patients or boost department productivity?

It's obvious that collecting data simply for the sake of doing so makes no sense. But making sense of this torrent of data is not easy, and it's only going to become more difficult. Inevitably, unless we change our ways, some data will be collected for no reason other than that collecting them is required. And the result won't be pretty.

With the best intentions, many years ago, I "loaned" a source to a staff writer. I suggested that he call my source for background on a story I had assigned him to write. I later learned that the writer began the conversation with "I don't know why I'm calling you, but I have been told to do it."

It was an eye-opener for me not only in my relationship with that writer but in the realization that unless a person chooses to go a certain way, the trip will be neither productive nor enjoyable.

 

Overcoming An Old-School Mindset

People being what we are, more is clearly needed than just introducing data streams into an operation or requiring their analysis. The staff affected by those streams must buy into their use. That buy-in will make the difference between mere connectivity and real productivity, between thinking in analog and thinking in digital.

How we think about something determines not only what we do but how we do it. Managers now face the prospect of delegating analytic duties to machines.

Deep learning algorithms (a.k.a. machine intelligence) are gliding under the radar into our daily lives. They make daily tasks easier. We see an example every time Google autocompletes a question based on its understanding of our interests, or serves us ads based on what we have recently searched for or may be interested in.

In the future, deep learning algorithms will be available to help make sense of the increasingly numerous and complex data streams being sent our way. Choosing what analyses will be delegated — and which will remain under human control — is only one type of question that must be answered.
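To make that concrete, here is a minimal sketch of what delegating one such analysis to a machine might look like: an incremental classifier updated batch by batch as data stream in. It uses scikit-learn, and the vital-sign features, values and labeling rule are entirely made up for the example; none of it comes from this blog.

```python
# Minimal sketch: incremental ("streaming") learning with scikit-learn.
# All vital-sign values and the labeling rule below are simulated for illustration.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()                  # linear model trained one batch at a time
classes = np.array([0, 1])               # 0 = "no alert", 1 = "alert"

for _ in range(100):                     # each pass stands in for one batch from a stream
    heart_rate = rng.normal(75, 15, size=32)
    spo2 = rng.normal(97, 2, size=32)
    X = np.column_stack([heart_rate, spo2])
    # Hypothetical rule used only to generate training labels for the sketch
    y = ((heart_rate > 100) | (spo2 < 92)).astype(int)
    model.partial_fit(X, y, classes=classes)   # update on the new batch only

print(model.predict([[110.0, 90.0], [70.0, 98.0]]))  # expect something like [1 0]
```

The point of the sketch is the partial_fit call: the model never sees the whole data set at once, it simply keeps learning from whatever the stream delivers next.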

I suspect that regardless of which analyses are performed by machines, unless people embrace their use, the analyses will be given less weight than those made by people. This won't change until the human-machine culture changes. And this has begun.

As an old schooler, I have been a little annoyed when passing someone speaking conversationally into a cell phone earpiece. To be honest, I typically mistake these conversations for ones directed at me, causing me to respond verbally or in some way engage with the stranger, only to pull back with a (hopefully) inaudible apology.

I like to think that I have been getting better at distinguishing personal conversations. I expect this to continue as I experience more people talking to machines. And there is plenty of this going on.

We are talking to computer surrogates, Siri and Cortana. And to Amazon's Alexa via Echo (and Echo Dot). I am, in fact, dictating this blog to a speech recognition program integrated with my computer.

Increasing familiarity with machines is necessary if we are to become fully immersed in the digital world of tomorrow and to make full use of the data it can provide. We are already hard-pressed to keep up with the extraordinary volumes of data streaming at us. If we are to use machines to make sense of these data in the future, we will need to improve the way we interact with them.

 

Changing How We Think

We may soon depend on visual analytics to make sense of data. Historically we have done so with bar graphs and line charts. These are now being assembled into dashboards to help us understand entire collections of data.
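As a small illustration of the kind of chart such a dashboard might surface, here is a matplotlib sketch; the modality names and weekly study counts are placeholders I invented for the example, not figures from any department.

```python
# Minimal sketch: one bar chart of the kind a departmental dashboard might show.
# Modalities and weekly study counts are placeholders, not real data.
import matplotlib.pyplot as plt

modalities = ["CT", "MRI", "X-ray", "Ultrasound"]
weekly_studies = [420, 180, 900, 310]    # hypothetical counts

fig, ax = plt.subplots()
ax.bar(modalities, weekly_studies)
ax.set_ylabel("Studies per week")
ax.set_title("Hypothetical imaging volume by modality")
fig.savefig("volume_dashboard.png")      # an image a dashboard panel could embed
```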

As we use voice ever more to interface with our computing and mobile devices, we will hear progressively more synthesized voices responding. It has even been proposed that, in the future, analyses — visual and audible — may be accompanied by music that varies like the soundtrack of a movie. This music would be added to elicit certain kinds of responses. Rather than sad or happy, data soundtracks might indicate positive or negative analyses, with varying levels of intensity signaling whether action should be taken quickly or can be delayed.
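To show how such a data soundtrack might be generated, here is a minimal sketch using only the Python standard library. The mapping from an analysis "urgency" score to pitch and loudness is my own invention for illustration, not anything proposed in this blog.

```python
# Minimal sketch: map an analysis "urgency" score (0.0-1.0) to a tone.
# Pitch and loudness rise with urgency; the mapping is purely illustrative.
import math
import struct
import wave

def sonify(score: float, path: str = "tone.wav", seconds: float = 1.0) -> None:
    rate = 44100
    freq = 220 + 660 * score             # 220 Hz (calm) up to 880 Hz (urgent)
    amp = 0.2 + 0.6 * score              # softer for low scores, louder for high
    frames = bytearray()
    for i in range(int(rate * seconds)):
        sample = amp * math.sin(2 * math.pi * freq * i / rate)
        frames += struct.pack("<h", int(sample * 32767))
    with wave.open(path, "w") as wav:
        wav.setnchannels(1)               # mono
        wav.setsampwidth(2)               # 16-bit samples
        wav.setframerate(rate)
        wav.writeframes(bytes(frames))

sonify(0.1)                               # low, soft tone for a benign finding
sonify(0.9, "urgent.wav")                 # higher, louder tone for an urgent one
```

A low score yields a low, soft tone; a high score yields a higher, louder one, mirroring the quick-versus-delayed distinction described above.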

Regardless of how analyses are done in the future, by whom or what, and how they are presented, the presentation will be designed to immerse us in the data. And that immersion may change the way we think.

 

Editor's note: This is the third blog in a four-part series on Changing the Look of Radiology. The first blog, "Risk Abatement May Determine the Future of Radiology," can be found here. The second blog, "How Radiology Can Improve Outcomes and Make Medicine Better," can be found here. The series can be found here.

 
