This article was originally published as a podcast episode. Click here to listen.

I wanted to share a recent experience with a client who was trying to do a denials analysis. I was working with one of their technical people, and one of the perennial challenges, of course, is this: how do you get technical people and functional people, billers, billing managers, and others, talking to each other so the work serves everybody?

What’s going on with data analysis?

Somebody technical usually has to do the data wrangling and the data analysis. Maybe you have an in-house analyst. Maybe you have an outside person who does some of the work, depending on how large your organization is. Perhaps a billing manager or somebody like that is trying to do some of the work in Excel. However it happens, the endgame is the same: we want answers. But somewhere along the way, things break down every so often.

I’m hoping we can draw a good lesson from this particular case. This is the second time I’ve sat down with a technical person who has the list of objectives, the questions we’re trying to answer. They pull up that list and say, “Okay, we’re trying to answer these things, and I’m having a hard time calculating this.” And I had to say, “Whoa, whoa, whoa, whoa! Stop! Let’s go back to your data. Is your data clean? Have you gotten your data into a position where you can do any analysis at all?”

Overcoming data analysis challenges

So we went back in and looked, and saw that they had to parse rejection and denial information out of a free-text field. There was a whole bunch of problems they hadn’t solved yet, so their data wasn’t in a good place for analysis, and yet they were already trying to get the analysis done. This is a frequent challenge.

In this case, for example, there were a ton of denials whose codes had never been parsed out. Without getting too technical: the regex wasn’t matching the pattern correctly, so it couldn’t identify the code and pull it out of all that surrounding text garbage. The codes were sitting right there in the data; they just hadn’t been identified and extracted.
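To make that concrete, here’s a minimal sketch in Python with pandas. The denial text, the column name, and the CO-97 example are all made up for illustration, not taken from the client’s data; the point is just that a regex written for one exact spelling of a code silently misses the variants sitting in the same field.

```python
import pandas as pd

# Toy examples of the free-text field; real remittance text is far messier.
texts = pd.Series([
    "DENIED CO-97 bundled into primary procedure",
    "Denied: CO 97 bundled",                # same code, different spelling
    "Payer does not accept paper claims",   # no code at all
])

# A pattern written for exactly "CO-97" silently misses the variants:
naive = texts.str.extract(r"(CO-97)", expand=False)

# A looser pattern (any two-letter group code, optional space or hyphen,
# then digits) catches the spelling variants; the code-less row is still NaN.
robust = texts.str.extract(r"([A-Z]{2}[ -]?\d{1,3})", expand=False)

print(pd.DataFrame({"text": texts, "naive": naive, "robust": robust}))
```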

What about denials?

There was also a whole series of denials with no code at all. Some said, for example, that the payer doesn’t accept paper claims, or that something else isn’t allowed, but no code was attached. If a code pattern is the only thing your lookup searches for, those records return nothing; they’re simply missed.
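One way to keep those code-less denials from vanishing, again as a rough sketch with hypothetical column names and values: extract whatever codes you can find, then explicitly bucket the rows where nothing matched, so they show up in the analysis as their own category instead of silently dropping out.

```python
import pandas as pd

# Hypothetical rows -- adapt the column names to your own extract.
df = pd.DataFrame({
    "denial_text": [
        "Denied CO-97: bundled service",
        "Payer does not accept paper claims",   # a denial, but no code
    ]
})

df["denial_code"] = df["denial_text"].str.extract(
    r"\b([A-Z]{2}-\d{1,3})\b", expand=False
)

# Don't let code-less denials disappear: flag them so they still land
# in the analysis as their own bucket.
df["denial_code"] = df["denial_code"].fillna("NO_CODE_IN_TEXT")
print(df)
```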

In other records, the lookup failed because it confused CARCs and RARCs: it saw a pattern and thought it was one thing when, in fact, it was another. The rules weren’t set up to strip the codes out correctly. An N30 is a good example: there was no reason code 30 anywhere in the actual remittances, but the parser reported one, because it had pulled the 30 out of the remark code N30.
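Here’s a minimal sketch of how that mix-up can be avoided, assuming, as the N30 example suggests, that the RARCs in the text carry a letter prefix (like N30 or MA04) while the CARCs appear as bare numbers. The trick is to pull the remark codes out first, so the digits inside an N30 never get counted as reason code 30.

```python
import re

RARC_RE = re.compile(r"\b([A-Z]{1,2}\d{1,3})\b")   # e.g. N30, MA04
CARC_RE = re.compile(r"\b(\d{1,3})\b")             # e.g. 30, 97

def split_codes(text: str) -> tuple[list[str], list[str]]:
    """Pull RARCs out first, then look for CARCs in what's left,
    so the '30' inside 'N30' never gets counted as a CARC."""
    rarcs = RARC_RE.findall(text)
    remainder = RARC_RE.sub(" ", text)   # remove RARCs before the CARC pass
    carcs = CARC_RE.findall(remainder)
    return carcs, rarcs

print(split_codes("Denied per N30"))         # ([], ['N30']) -- no phantom CARC 30
print(split_codes("CARC 30, remark N30"))    # (['30'], ['N30'])
```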

What about claims?

If you don’t have all of that set up correctly, you end up with tons and tons of claims carrying no information, or incorrect information, and you’re already trying to calculate and analyze before the data is even right.

I think everybody’s in such a rush, understandably, to get to answers. But if we don’t do the heavy lifting and spend the time up front to make sure the data is good, we end up with holes and misinformation, and the analysis can even lead us down the wrong paths entirely.

Make sure your data is clean

We had to take a step back and say, “No, forget the analysis for now. Back up, get your data clean and in good shape, and then move forward. Walk before you run.” I know it’s sometimes hard to do that, but if we don’t, we get problems.

Sometimes the damage isn’t even visible. We’ve seen situations where somebody did an analysis and the “top problems” it surfaced weren’t actually the top problems, because the data was garbage. They’d been running down a path, solving problems, and the needle wasn’t moving significantly. They couldn’t understand why, and it was because their data wasn’t good. They had tackled the wrong problems and made minor improvements at the margins instead of attacking the top issues.

Final thought

Moral of the story: make sure the data is good. Spend the time getting the data correct first. Then you’ll find that the analysis itself is easy; if all of that cleanup work is done correctly, it can be done in a pivot table in Excel.
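For anyone doing this in Python instead of Excel, that pivot really is just a few lines of pandas once the codes are clean. The column names and figures below are hypothetical, including the NO_CODE_IN_TEXT bucket from the earlier sketch:

```python
import pandas as pd

# Hypothetical cleaned data: one row per denial, with the extracted code.
df = pd.DataFrame({
    "denial_code": ["97", "97", "N30", "NO_CODE_IN_TEXT"],
    "billed_amount": [120.0, 80.0, 250.0, 40.0],
})

# Count and dollar-sum denials by code -- the pandas equivalent of
# an Excel pivot table, sorted by total dollars at stake.
summary = pd.pivot_table(
    df,
    index="denial_code",
    values="billed_amount",
    aggfunc=["count", "sum"],
).sort_values(("sum", "billed_amount"), ascending=False)

print(summary)
```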