Jan. 6, 2026 — Butterfly Network, a provider of handheld, whole-body ultrasound and intuitive software, has announced plans to release its Beam Steering API for advanced third-party AI application and technology development within the company’s Butterfly Garden and Ultrasound-on-Chip co-development programs. The new API release, expected in the first half of 2026, will open access to core capabilities that were previously reserved for the company’s own app.
The basis of the Beam Steering API is Butterfly’s electronically steered 3D imaging software, including off-axis beam tilt of up to 20 degrees. The technology can support advanced applications that reduce reliance on precise probe positioning, helping clinicians capture high-quality images more easily and consistently.
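For developers weighing what access to this capability might look like, the following Python sketch shows one way a beam-steering request could be validated against the stated 20-degree off-axis limit. It is purely illustrative: the BeamSteeringRequest class, its field names, and the validation behavior are assumptions, not the actual Butterfly API, which has not yet been published.

```python
from dataclasses import dataclass

# Illustrative only: the real Beam Steering API has not been published.
# Field names and the handling of the 20-degree limit are assumptions
# based on the announcement's description of off-axis beam tilt.

MAX_OFF_AXIS_TILT_DEG = 20.0  # stated maximum off-axis tilt

@dataclass
class BeamSteeringRequest:
    """Hypothetical request for an electronically steered beam."""
    azimuth_deg: float    # tilt in the lateral plane
    elevation_deg: float  # tilt in the elevation plane

    def validate(self) -> None:
        # Reject requests outside the stated off-axis tilt range.
        for name, angle in (("azimuth_deg", self.azimuth_deg),
                            ("elevation_deg", self.elevation_deg)):
            if abs(angle) > MAX_OFF_AXIS_TILT_DEG:
                raise ValueError(
                    f"{name}={angle} exceeds the {MAX_OFF_AXIS_TILT_DEG}-degree limit"
                )

# Example: a 15-degree lateral tilt combined with a 5-degree elevation tilt.
request = BeamSteeringRequest(azimuth_deg=15.0, elevation_deg=5.0)
request.validate()
```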
As portable ultrasound moves earlier in the care journey and into more clinical settings, these capabilities become increasingly important. Expanding use in point-of-care, ambulatory, and frontline environments makes usability a defining factor, particularly in enabling clinicians of varying skill levels to acquire high-quality images.
“As the first to offer digital beam steering access to developers, Butterfly is changing the game for imaging AI, empowering more advanced tools that simplify image acquisition,” said Steve Cashman, Chief Business Officer of Butterfly Network. “As our platform evolves, Butterfly partners gain access to more of our core imaging capabilities, allowing them to build richer, more sophisticated applications. That ecosystem dynamic can accelerate development and drive adoption by helping ultrasound reach more providers, earlier in care and across more settings.”
Unlike legacy ultrasound systems, Butterfly is built on a semiconductor chip using a fully electronic 2D capacitive micromachined ultrasonic transducer (CMUT) array with approximately 9,000 elements, enabling software-defined beam steering across all three dimensions without mechanical motion.
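To make the idea of software-defined steering concrete, the sketch below shows the textbook delay-and-steer math used by any phased array: each element fires with a small time offset so the combined wavefront tilts toward the target angle. This is a generic illustration of the principle, not Butterfly's implementation; the element pitch, aperture size, and one-dimensional layout here are assumed values for the example only.

```python
import math

# Generic phased-array beam steering: per-element firing delays tilt the
# transmitted wavefront by a chosen angle. Textbook math, not Butterfly's code.

SPEED_OF_SOUND_M_S = 1540.0   # typical speed of sound in soft tissue
ELEMENT_PITCH_M = 200e-6      # assumed element spacing (not a Butterfly spec)
NUM_ELEMENTS = 8              # small example aperture

def steering_delays(theta_deg: float) -> list:
    """Per-element firing delays (seconds) that tilt the beam by theta_deg."""
    theta = math.radians(theta_deg)
    delays = []
    for n in range(NUM_ELEMENTS):
        x_n = n * ELEMENT_PITCH_M                        # element position along the array
        delays.append(x_n * math.sin(theta) / SPEED_OF_SOUND_M_S)
    earliest = min(delays)                               # shift so the first element fires at zero
    return [d - earliest for d in delays]

# Example: delays for a 20-degree off-axis tilt.
for n, d in enumerate(steering_delays(20.0)):
    print(f"element {n}: {d * 1e9:.1f} ns")
```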
Victor Ku, PhD, Chief Technology Officer of Butterfly Network, added, “Because our beam steering is implemented entirely in software on a semiconductor chip, we already support advanced imaging modes like Biplane, iQ Slice, and iQ Fan on our own platform. That same architecture is what allows us to thoughtfully open these core capabilities to third-party developers through the new API.”
The planned SDK access is expected to support advanced imaging workflows across a subset of existing presets, including abdominal, cardiac, obstetric, musculoskeletal, vascular, and lung imaging. Availability, scope, and technical details will depend on ongoing development, regulatory considerations, and partner agreements.
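As a rough illustration of how preset-scoped access might appear to a developer, the sketch below enumerates the presets named in the announcement and checks whether beam steering is available on a given one. The enum, the capability set, and the supports_beam_steering helper are all assumptions; which presets are actually included, and the shape of the SDK surface, will depend on the eventual release.

```python
from enum import Enum, auto

# Illustrative only: preset names come from the announcement, but the enum,
# the capability set, and the helper are assumptions about what a
# preset-scoped SDK might expose.

class Preset(Enum):
    ABDOMINAL = auto()
    CARDIAC = auto()
    OBSTETRIC = auto()
    MUSCULOSKELETAL = auto()
    VASCULAR = auto()
    LUNG = auto()

# Hypothetical subset with beam steering enabled; the announcement says only
# that a subset of existing presets is expected to be supported.
BEAM_STEERING_PRESETS = {Preset.ABDOMINAL, Preset.CARDIAC, Preset.LUNG}

def supports_beam_steering(preset: Preset) -> bool:
    """Return True if the hypothetical SDK exposes beam steering for this preset."""
    return preset in BEAM_STEERING_PRESETS

if __name__ == "__main__":
    for preset in Preset:
        status = "available" if supports_beam_steering(preset) else "not available"
        print(f"{preset.name.lower():>16}: beam steering {status}")
```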
Developers interested in exploring use cases or sharing ideas for this upcoming release are encouraged to inquire through Butterfly Garden at www.butterflynetwork.com/ai-developers