Will AI Replace Healthcare Coders? Everything You Need to Know

See this article by Penstock president Laura Collier as it originally appeared in ICD10monitor.

It’s important to be realistic about what AI can and can’t do currently.

Artificial intelligence (AI) has been a buzzword in payment integrity for years. Companies are increasingly using AI and data mining to identify healthcare savings. This technology can be helpful in the right scenarios.

Data versus Nuance

AI, which includes technologies like machine learning, works well in certain settings involving relatively straightforward claims data. It is good at analyzing data, recognizing patterns and outliers, compiling statistics, and making calculations in scenarios where the appropriate code assignment is obvious and there are no gray areas. But not every scenario is strictly data-driven. Some situations need to be evaluated and interpreted, and that means they require human experience and intelligence.

When a patient’s chart is reviewed, clinical decisions governing diagnosis and treatment are made by a physician. This includes creating notes that verify what actually happened and precisely what the payor is paying for. Physicians are required to understand and document the situation: the evaluation methods and processes used to care for the patient, the level of care provided, the patient’s risk categories and circumstances, and so on. Artificial intelligence is not always capable of picking up on these nuances. Medical records still need to be reviewed by humans to answer questions that data alone may not be able to answer.

AI cannot use human experience to make decisions. It is not well-suited for situations that require logical reasoning or interpretation of notes.

A Matter of Intent

Accurate coding should recreate a physician’s thought process. Machines may be able to identify and analyze codes and keywords, but they cannot interpret intent. Take, for example, the selection of a principal diagnosis on an inpatient claim, through which the coder is supposed to identify the condition that was found, after study, to have occasioned the admission. This requires some interpretation of what’s going on in the patient’s stay that may not be accurately captured in the kind of data points that machine learning relies on – in other words, discrete pieces of data, such as specific codes or keywords.

AI can collect the data, but it takes a human reader capable of asking questions to put that data together into a story that accurately reflects the physician’s intent and the patient’s experience. This may involve discussion with a physician to find out whether certain factors were ruled in or ruled out during the encounter. If a particular keyword shows up in a record, does it mean that factor was ruled out? AI may not be able to tell.
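
To make that limitation concrete, here is a minimal, hypothetical sketch of keyword-based flagging applied to a chart note. The note text, keyword list, and logic are invented for illustration; they do not represent any vendor’s actual system.

```python
# Hypothetical illustration: naive keyword matching on a chart note.
note = "Patient febrile on admission. Blood cultures negative; sepsis ruled out."
keywords = ["sepsis", "septic shock"]

# A simple substring search finds "sepsis" and flags the chart, even though
# the note says the condition was ruled out.
flagged = [kw for kw in keywords if kw in note.lower()]
print(flagged)  # ['sepsis'] -- the match says nothing about whether the condition was ruled in or out

# A human reader (or a question to the physician) is still needed to decide
# whether the condition was confirmed, suspected, or excluded.
```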

Gray Areas

Relying on AI for broad-stroke information can also be problematic. Say, for example, that a particular facility’s AI-reviewed data shows a high rate of codes for certain complications. At first pass, the facility looks like an outlier and a potential problem. But when a human reviews the data, it turns out the facility is a trauma center: the codes were appropriate, and the AI flagged them as anomalous only because it lacked that context.
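
As a rough illustration of how a purely statistical flag can miss context, here is a hypothetical sketch. The facility names, complication rates, and threshold are all made up; the point is only that a rate-based rule knows nothing about case mix.

```python
# Hypothetical illustration: flagging facilities whose complication-code rate
# is far above the group average, with no knowledge of facility type.
from statistics import mean

# Made-up complication-code rates (complication codes per 100 claims)
rates = {"Facility A": 2.1, "Facility B": 1.8, "Facility C": 2.4, "Trauma Center D": 9.5}

# Arbitrary rule for this sketch: flag anything more than double the average.
avg = mean(rates.values())
outliers = [name for name, rate in rates.items() if rate > 2 * avg]
print(outliers)  # ['Trauma Center D']

# The trauma center is flagged even though its case mix explains the higher
# rate -- a human reviewer supplies the context the statistics lack.
```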

These problems are a measure of the complexity of the healthcare field. Coding has so many variables that it sometimes takes a bit of human ingenuity to figure out what is really going on in any given case. This is especially true where there is a gray area around the coding process, such as in cases of sepsis, where the criteria are not always simple and clear-cut. Even in seemingly straightforward situations, coding and reimbursement are complex, and technology alone cannot guarantee improvements in payment integrity.

One sure way in which AI can benefit a payment integrity program, however, is by tracking and flagging more common errors. This in turn frees up resources to allow your team to focus on the more complex situations that AI may not be as good at handling.

Until AI starts to look more human-like – think Data from Star Trek, as opposed to Alexa – the best payment integrity work will come from a combination of humans and machines. When experienced coders are equipped with the best tools, you get the best of both worlds.
