News | Ultrasound Imaging | October 28, 2020

Northwestern Medicine Introduces Artificial Intelligence to Improve Ultrasound Imaging

Northwestern Memorial Hospital first in U.S. to adopt Caption AI in emergency department, ICU and cardio-oncology clinic  

A sonographer at Northwestern Memorial Hospital using the Caption Health AI to optimize ultrasound imaging.

October 28, 2020 — Northwestern Memorial Hospital is the first hospital in the United States to purchase Caption Health’s artificial intelligence (AI) technology for ultrasound, Caption AI. The FDA-cleared, AI-guided ultrasound system enables healthcare providers to acquire and interpret quality ultrasound images of the heart, increasing access to timely and accurate cardiac assessments at the point of care.

Performing an ultrasound exam is a complex skill that takes years to master. Caption AI enables clinicians — including those without prior ultrasound experience — to quickly and accurately perform diagnostic-quality ultrasound exams by providing expert turn-by-turn guidance, automated quality assessment and intelligent interpretation capabilities. The systems are currently in the hospital’s emergency department, medical intensive care unit, cardio-oncology clinic and in use by the hospital medicine group.

"Through our partnership with Caption Health, we are looking to democratize the echocardiogram, a stalwart tool in the diagnosis and treatment of heart disease,” said Patrick McCarthy, M.D., chief of cardiac surgery and executive director of the Northwestern Medicine Bluhm Cardiovascular Institute, a group involved in the early development of the technology. “Our ultimate goal is to improve cardiovascular health wherever we need to, and Caption AI is increasing access throughout the hospital to quality diagnostic images.”

Caption AI emulates the expertise of a sonographer by providing real-time guidance on how to position and manipulate the transducer, or ultrasound wand, on a patient’s body. The software shows clinicians in real time how close they are to acquiring a quality ultrasound image, and automatically records the image when it reaches the diagnostic-quality threshold. Caption AI also automatically calculates ejection fraction, or the percentage of blood leaving the heart when it contracts, which is the most widely used measurement to assess cardiac function.
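Caption Health has not published the internals of its ejection-fraction calculation, but the quantity itself has a standard definition: the fraction of the left ventricle's end-diastolic volume ejected with each contraction. A minimal illustrative sketch, assuming the end-diastolic and end-systolic volumes have already been estimated from the images:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left-ventricular ejection fraction as a percentage.

    EF = (EDV - ESV) / EDV * 100, where EDV and ESV are the
    end-diastolic and end-systolic volumes in milliliters.
    """
    if edv_ml <= 0:
        raise ValueError("end-diastolic volume must be positive")
    return (edv_ml - esv_ml) / edv_ml * 100

# Representative healthy-heart volumes: ~120 mL at end-diastole,
# ~50 mL at end-systole, giving an EF in the normal 50-70% range.
print(round(ejection_fraction(120, 50), 1))  # 58.3
```

The volume figures here are illustrative, not taken from the article; in practice the hard part is estimating the volumes from the ultrasound frames, which is what the AI automates.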

“Northwestern Medicine has been a tremendous partner in helping us develop and validate Caption AI. We are thrilled that they are bringing Caption AI into key clinical settings as our first customer,” said Charles Cadieu, chief executive officer and co-founder of Caption Health. “The clinical, economic and operational advantages of using AI-guided ultrasound are clear. Most important, this solution increases access to a safe and effective diagnostic tool that can be life-saving for patients.”

Point-of-care ultrasound (POCUS) has a number of benefits. Increased usage of POCUS contributes to more timely and accurate diagnoses, more accurate monitoring and has been shown to lead to changes in patient management in 47% of cases for critically ill patients.[1] POCUS also allows patients to avoid additional visits to receive imaging, as well as providing real-time results that can be recorded into a patient’s electronic medical record.

"I think the most exciting part is that Caption AI allows our intensive care unit (ICU) providers to do a point-of-care, real-time ultrasound for a sick patient,” said James “Mac” Walter, M.D., associate program director for the pulmonary and critical care medicine fellowship at Northwestern Medicine, who first used the technology on COVID-19 patients in Northwestern Memorial’s ICU. “It’s a way to integrate two worlds — real-time point-of-care ultrasounds and urgent care in the ICU — with details that are ready for cardiologists when they need them.”

The Bluhm Cardiovascular Institute committed to investigating the role of AI and machine learning in the diagnosis and treatment of cardiovascular disease in 2018. Its Clinical Trials Unit was one of Caption Health’s first partners, investigating the effectiveness of Caption Guidance in a clinical trial based, in part, out of Northwestern Memorial Hospital that served as the basis for the software’s landmark FDA authorization earlier this year.

In addition to being used to diagnose and treat patients, Caption AI will be used to train fellows, who typically need months of practice to learn accurate echocardiography technique, by giving them real-time feedback as they scan.

Video demo of how the AI technology works to optimize ultrasound imaging.  

For more information: captionhealth.com, feinberg.northwestern.edu/sites/bcvi-ctu

 

Reference:

1. Hall DP, Jordan H, Alam S, et al. The impact of focused echocardiography using the Focused Intensive Care Echo protocol on the management of critically ill patients, and comparison with full echocardiographic studies by BSE-accredited sonographers. Journal of the Intensive Care Society. First published March 28, 2017. https://doi.org/10.1177/1751143717700911
