Leveraging Technology to Change the U.S. Healthcare System
It is no secret that the American healthcare system must change to survive economically, but the question has been how such a massive undertaking could be accomplished. The Affordable Care Act (ACA) was born out of concern over annual healthcare expenditures growing at unsustainable rates, projections that Medicare would be unable to pay its bills in the coming years, and data showing that other countries deliver higher levels of care at lower cost per patient. While the law has created a firestorm of controversy and dissent, it does outline a pathway from the current method of conducting business to a healthcare system based on empirical data and workflow efficiency. These aims are slowly being achieved through the use of health information technology (IT).
Whatever its faults, the ACA has at last forced providers to abandon archaic paper, film and CD filing systems and adopt electronic medical record (EMR) systems (PACS, CVIS and ECG management included) that make patient data more accessible and improve turnaround times. Unlike most other industries over the past three decades, healthcare has largely failed to harness computing power to improve business efficiency, and it took a federal mandate to force the issue.