When the Obama administration authorized the law that funded the expansion of electronic health records (EHRs), the idea was that it would transform American healthcare. Records would become readily portable, and individuals and their doctors would have easy access to that information when it was needed most. Instead, a decade later, the US government has poured more than $36 billion into EHR development and adoption, and our healthcare system is as fragmented as ever. Many hospitals have found that their systems work so poorly that using EHRs actually hinders patient care. It’s a system in crisis. Faced with this problem, and with a rapidly changing healthcare landscape, tech experts are now considering an alternative approach to the EHR problem: artificial intelligence. AI is steadily becoming an important part of our healthcare system. Some of its functions are simple, such as AI-based workflow tools that improve practice efficiency, while other combinations of EHRs and AI are considerably more complex. As AI begins to play a greater role in diagnosis and treatment for those with limited access to insurance and traditional healthcare, it may also be able to help EHRs live up to their potential.
Why AI Matters
AI, often discussed in terms of machine learning, is all about pattern recognition. It studies enormous amounts of data to find trends, repetitions, and correlations, and uses them to produce consistent outcomes. That might mean predicting what follows a common sequence of events or spotting the moment when an action breaks its usual pattern. This pattern recognition is what makes AI-based health services so reliable. HealthTap, for example, lets users consult its “Dr. AI” for symptom assessment – no doctor required. There’s also a growing market for virtual nursing assistants. These AI nursing programs can help prevent hospital readmissions and reduce return doctor appointments, saving everyone time and money.
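To make the idea of “identifying moments when an action breaks its pattern” concrete, here is a minimal, hypothetical sketch – not any vendor’s actual system. It learns which event usually follows which in a set of invented care-visit logs, then flags transitions it has rarely or never seen.

```python
from collections import Counter

def learn_transitions(sequences):
    """Count observed (event, next_event) pairs across many sequences."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    return counts

def is_unusual(counts, event, next_event, min_support=2):
    """Flag a transition seen fewer than min_support times in training."""
    return counts[(event, next_event)] < min_support

# Invented example data: routine visits follow intake -> vitals -> exam.
history = [
    ["intake", "vitals", "exam", "discharge"],
    ["intake", "vitals", "exam", "labs", "discharge"],
    ["intake", "vitals", "exam", "discharge"],
]
model = learn_transitions(history)

print(is_unusual(model, "intake", "vitals"))     # seen 3 times -> False
print(is_unusual(model, "intake", "discharge"))  # never seen -> True
```

Real clinical systems use far richer statistical models, but the core move is the same: learn what “normal” looks like from historical data, then surface the exceptions.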
AI Meets EHRs
Developers hope that AI, run in conjunction with EHRs, will enhance how hospitals use their software in several ways. Many of these aim to streamline the overall office environment, starting with data entry and dictation. Most doctors already use voice dictation software to manage their notes, but these programs are limited: they struggle to differentiate between similar-sounding words and to parse natural grammar, and many only work with a small set of complementary programs. EHRs with built-in AI that uses natural language processing (NLP) would make it easier for doctors to dictate information for patient charts and to add key notes directly into the EHR using a program that understands context. This is far more advanced than standard dictation software, which merely recognizes sounds. One practice that has already adopted AI-based dictation for its EHRs is OrthoAtlanta, a 14-office group with 37 practicing physicians. Since making the change, the practice has seen average note completion time drop from 4.8 minutes to 1.6 minutes. Considering how many cases a practice of this size sees in a day, that time savings adds up quickly.

Another way hospitals are combining the powers of EHRs and AI is to identify the source of major infections. Using the combined library of time and location stamps from over 90,000 patients, University of California, San Francisco (UCSF) Health was able to identify the source of a Clostridium difficile infection spreading through its system. C. diff can lead to sepsis and can be fatal in vulnerable patients, which made tracing the infection back to its source vital to protecting the entire hospital population. The same approach could be applied to hospital-acquired infections more broadly.

Finally, AI-EHR partnerships may help solve one of the biggest issues with EHR software today – test submissions and results.
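The time-and-location approach can be sketched very simply. The following is a hypothetical toy version – UCSF’s actual analysis over 90,000 patient records is far more sophisticated – that counts which place-and-day slots infected patients shared; a slot shared by many infected patients is a candidate source.

```python
from collections import Counter

def candidate_sources(stays, infected):
    """stays: list of (patient, location, day); infected: set of patient ids.
    Returns (location, day) slots shared by 2+ infected patients,
    most-shared first."""
    slot_patients = {}
    for patient, location, day in stays:
        if patient in infected:
            slot_patients.setdefault((location, day), set()).add(patient)
    overlap = Counter(
        {slot: len(p) for slot, p in slot_patients.items() if len(p) > 1}
    )
    return overlap.most_common()

# Invented example: three infected patients all passed through radiology
# on day 3, but were otherwise on different wards.
stays = [
    ("p1", "radiology", 3), ("p2", "radiology", 3), ("p3", "radiology", 3),
    ("p1", "ward-a", 1), ("p2", "ward-b", 2), ("p3", "ward-c", 2),
]
print(candidate_sources(stays, {"p1", "p2", "p3"}))
# [(('radiology', 3), 3)]
```

Here the shared radiology slot stands out immediately; in a real hospital, the same overlap logic run over years of EHR timestamps is what lets investigators narrow thousands of possible exposure points down to one.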
In too many cases, EHRs fail to submit tests to the laboratory or to properly alert doctors of new results, causing serious diagnostic delays and harm to patients. New AI can not only track test submission and result notification as part of the treatment pattern, but also participate in reading diagnostic imaging. Depending on the type of imaging or test, AI may actually be better at reading and interpreting results than individual doctors. At the very least, these programs can help doctors identify where a problem might be.

EHRs could be a powerful force for improving healthcare, but the way they’re currently designed has been a real hindrance and has prevented existing programs from working well together. By adding AI into the mix, though, EHRs gain a built-in safety system: a program designed to catch mistakes on the part of the doctor or the software. And after so much money has been spent implementing EHRs, AI is turning them into a tool for saving money by aiding in diagnosis and patient management. This is how we change healthcare – by combining two significant bodies of data, sharing strengths, and compensating for weaknesses to provide patients with the best possible care.
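To make the “built-in safety system” idea concrete, here is a minimal, hypothetical sketch of the simplest such check: flag any lab order that still has no matching result after a waiting period, so a dropped submission or lost notification gets surfaced instead of silently delaying a diagnosis. The function names, order IDs, and three-day threshold are all invented for illustration.

```python
def overdue_orders(orders, results, now, max_wait_days=3):
    """orders/results: dicts of order_id -> day number (ordered/resulted).
    Returns order IDs with no result after more than max_wait_days."""
    resulted = set(results)
    return [
        oid for oid, ordered_day in orders.items()
        if oid not in resulted and now - ordered_day > max_wait_days
    ]

# Invented example: A100 was ordered on day 1 and never resulted,
# A101 came back on day 4, A102 is recent enough to still be pending.
orders = {"A100": 1, "A101": 2, "A102": 6}
results = {"A101": 4}
print(overdue_orders(orders, results, now=7))  # ['A100']
```

A production system would pull these records from the EHR database and route alerts to the ordering physician, but the underlying safety net is exactly this kind of pattern check: an expected follow-up event that never arrived.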