Blog | Dave Fornell, ITN Editor | HIMSS | March 03, 2017

Dave Fornell is the editor of ITN and DAIC magazines.

Two Technologies That Offer a Paradigm Shift in Medicine at HIMSS 2017


A view through a HoloLens augmented reality visor showing an overlaid brain image co-registered with a live patient on a table at HIMSS 2017.

Information technology (IT) is among the least sexy areas to cover in medical technology, and it is often difficult to find really interesting news as I sift through more than 1,300 vendors at the massive annual Healthcare Information and Management Systems Society (HIMSS) conference. However, at this year’s conference I found two exciting new technologies that I feel have the potential to become paradigm shifts in medicine. The first is the integration of artificial intelligence into medical imaging IT systems. The second, and the coolest tech at HIMSS, was the use of augmented reality visors to create a heads-up display of 3-D anatomical reconstructions, or complete computed tomography (CT) or magnetic resonance imaging (MRI) datasets, that surgeons can use in the operating room (OR).

Medical imaging applications of augmented reality were shown by two vendors at HIMSS, although dozens of other booths used the same Microsoft HoloLens augmented reality visors for fun activities in an attempt to draw in attendees. TeraRecon debuted its cloud-based augmented reality solution, the HoloPack Portal, which extends TeraRecon’s 3-D viewing of CT anatomical reconstructions to true 3-D images projected in the Microsoft HoloLens visor. The system uses voice commands and finger movements to enlarge, shrink or rotate the 3-D images, so surgeons do not have to break the sterile field in the OR.
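
For readers curious about what that gesture and voice control amounts to under the hood, the sketch below shows the basic idea in Python: recognized commands are translated into rotation and scale transforms applied to the pose of the holographic model. This is only a generic illustration under my own assumptions, not TeraRecon’s implementation, and the command names and values are hypothetical.

```python
import numpy as np

def rotation_matrix(axis: str, degrees: float) -> np.ndarray:
    """Return a 4x4 homogeneous rotation matrix about a principal axis."""
    t = np.radians(degrees)
    c, s = np.cos(t), np.sin(t)
    m = np.eye(4)
    if axis == "x":
        m[1:3, 1:3] = [[c, -s], [s, c]]
    elif axis == "y":
        m[[0, 0, 2, 2], [0, 2, 0, 2]] = [c, s, -s, c]
    else:  # "z"
        m[0:2, 0:2] = [[c, -s], [s, c]]
    return m

def scale_matrix(factor: float) -> np.ndarray:
    """Return a 4x4 uniform scaling matrix."""
    m = np.eye(4)
    m[:3, :3] *= factor
    return m

# Hypothetical command stream from the headset's voice/gesture recognizer.
commands = [("rotate", "y", 15.0), ("enlarge", 1.25), ("rotate", "x", -10.0)]

model_transform = np.eye(4)  # current pose of the holographic reconstruction
for cmd in commands:
    if cmd[0] == "rotate":
        model_transform = rotation_matrix(cmd[1], cmd[2]) @ model_transform
    elif cmd[0] == "enlarge":
        model_transform = scale_matrix(cmd[1]) @ model_transform
    elif cmd[0] == "shrink":
        model_transform = scale_matrix(1.0 / cmd[1]) @ model_transform

print(model_transform)
```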

Novarad showed a similar work-in-progress system using the HoloLens that allowed attendees to see co-registered CT and MRI datasets overlaid on a live patient lying on a table. Using hand and finger gestures in the air, the user can step through the dataset slice by slice or change the orientation of the slices. Novarad also showed a video of how the system would work in a real OR, letting surgeons see the underlying anatomical structures for real-time navigation without having to look at a reference screen across the room or ask someone else to change the view for them.
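
The key technical step in a demo like this is co-registering the CT and MRI volumes so they can be fused and scrolled through together. The snippet below is a minimal, generic sketch of that step using the open-source SimpleITK library, with placeholder file names; it is not Novarad’s software, and it says nothing about how the overlay is anchored to the live patient.

```python
import SimpleITK as sitk

# Load the two datasets (file names are placeholders).
fixed = sitk.ReadImage("mri_head.nii.gz", sitk.sitkFloat32)   # reference volume
moving = sitk.ReadImage("ct_head.nii.gz", sitk.sitkFloat32)   # volume to be aligned

# Rigid registration: mutual information metric, gradient-descent optimizer.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))
reg.SetInterpolator(sitk.sitkLinear)
transform = reg.Execute(fixed, moving)

# Resample the CT into the MRI frame and blend one axial slice for display.
ct_in_mri = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
overlay = sitk.Cast(
    0.5 * sitk.RescaleIntensity(fixed) + 0.5 * sitk.RescaleIntensity(ct_in_mri),
    sitk.sitkUInt8)
sitk.WriteImage(overlay[:, :, overlay.GetDepth() // 2], "overlay_mid_slice.png")
```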

After attending HIMSS, I actually feel energized about the prospects artificial intelligence (AI) may offer medicine. But, unlike the science fiction image that snaps into most people’s minds when you talk about AI, it will not be a cool, interactive, highly intelligent robot that replaces doctors. In fact, most users will not even be aware AI is assisting them in the back end of their electronic medical record (EMR) systems. AI has been discussed for a few years now at all the medical conferences I attend, but at HIMSS I saw some of my first concrete examples of how AI (also called deep learning or machine learning) will help clinicians significantly reduce reading time and improve workflow efficiency. AI will accomplish this by working in the background as an overlay software system that sits on top of a hospital’s picture archiving and communication system (PACS), specialty reporting systems and medical image archives.

The AI algorithms are trained through machine learning to recognize complex patterns and relationships in the specific types of data relevant to the image or disease state being reviewed. In one example I saw from Agfa’s new integration of IBM Watson’s AI, the system was smart enough to look at a digital X-ray image and recognize that the patient had lung cancer and evidence of prior lung and heart surgeries. It then automatically searched the patient’s records for oncology treatments, cardiology reports, prior chest exams from various modalities, recent lab results and relevant history, such as smoking.

Philips Healthcare showed its Illumeo adaptive intelligence software, which uses AI to speed workflow. The example demonstrated was for oncology, where a CT exam showed several tumors. The user can hover over and click on a specific piece of anatomy on a specific slice and orientation. The system then automatically pulls in prior CT scans of the same region and presents the images from each exam in the same slice and orientation as the current image. If the AI determines the finding is a tumor, the system also runs automatic quantification of the tumor sizes from all the priors and presents them in a side-by-side comparison. The goal of the software is to greatly speed workflow and assist doctors in their tasks.
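
To give a feel for what the auto-quantification side of such a comparison looks like, here is a rough Python sketch of tallying lesion burden across a current exam and its priors. The segmentation masks here are random placeholders standing in for whatever the AI produces, and none of this reflects how Illumeo is actually implemented.

```python
import numpy as np
from scipy import ndimage

def lesion_volumes_ml(mask: np.ndarray, voxel_spacing_mm: tuple) -> list:
    """Label connected lesion voxels in a binary mask and return each lesion's volume in mL."""
    labeled, n = ndimage.label(mask)
    voxel_ml = np.prod(voxel_spacing_mm) / 1000.0  # mm^3 -> mL
    return [float(np.sum(labeled == i) * voxel_ml) for i in range(1, n + 1)]

# Hypothetical segmentation masks for two priors and the current exam (same grid/orientation).
rng = np.random.default_rng(0)
exams = {
    "2015-06-01": rng.random((64, 64, 32)) > 0.995,
    "2016-01-15": rng.random((64, 64, 32)) > 0.993,
    "2016-08-30": rng.random((64, 64, 32)) > 0.990,
}
spacing = (0.7, 0.7, 2.5)  # voxel spacing in mm

# Side-by-side comparison of lesion count and total burden across exams.
for date, mask in exams.items():
    vols = lesion_volumes_ml(mask, spacing)
    print(f"{date}: {len(vols)} lesions, total {sum(vols):.2f} mL")
```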

AI is also making its appearance in business and clinical analytics software, as well as imaging modality software, where it can automatically identify organs and anatomy, orient the images into the standard reading views and perform auto quantification. This is already available on some systems, including echocardiography for automated ejection fractions and wall motion assessments.
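
The echocardiography example comes down to simple arithmetic once the software has automatically traced the ventricle: ejection fraction is the fraction of the end-diastolic volume ejected with each beat. A tiny illustration, with made-up volumes:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left-ventricular ejection fraction (%) from end-diastolic and end-systolic volumes."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

# Illustrative values only: EDV 120 mL, ESV 50 mL gives an EF of about 58 percent.
print(f"EF = {ejection_fraction(120.0, 50.0):.1f}%")
```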

Watch the VIDEO from HIMSS 2016 "Expanding Role for Artificial Intelligence in Medical Imaging."
 

Read the article from HIMSS 2017 "How Artificial Intelligence Will Change Medical Imaging."
