The hidden (but real) link between data validation and care. Plus, 4 practices for doing it right.

So, you determine that physician access to legacy clinical data is a priority, particularly for your most acute patient population. More visibility into a complete patient history, more informed decision-making in today's and tomorrow's care. Makes sense.

The legacy data is migrated to an active archive or converted to your EMR for use by your physicians. They begin to access the data and pause: something doesn't look right. Or worse, something important is missing and they don't know it's missing. Workflows are interrupted, or new ones are initiated. Questions ensue.

Is this a one-time thing or is it systemic? What is the negative impact on caring for an acute patient population? Will the impact reach all the way to Length of Stay (LOS) or other metrics used to determine reimbursement? What about physician satisfaction and patient experience? How much productivity is lost? How many people are diverted from other priorities to "fix" the problem?

Unfortunately, data migration projects are notorious for their negative impact on organizations. According to a published Gartner report, more than 50% of such projects exceed budget and/or disrupt the business because of flawed execution.

Dig a little deeper into the problem and you'll find that a top contributor is poorly executed validation of the legacy data, a critical step in the migration or conversion process. Some of your peers, maybe even you, have experienced the ripple effect of a poorly executed data validation process. While the surprise cost and the dread of fixing data quality are painful, the negative impact on the organization can reach all the way to the experience of clinicians and patients.

But it doesn’t have to be that way for you, your team, or your patients when migrating your legacy data. And it won’t be that way with some planning and old-fashioned discipline in the execution of your data validation process.

There are four primary methods used in a best-practice validation testing process for a legacy clinical data migration:

1) Statistical testing.

Clinical data from legacy systems is migrated to the new location in a consistent fashion; during each system's migration, the integrity of the migrated data is validated using statistical comparisons with information obtained from the legacy system.
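
To make that concrete, here is a minimal sketch of what a statistical comparison might look like, assuming both the legacy extract and the migrated copy of a table are loaded as pandas DataFrames. The key and date column names are illustrative placeholders, not a reference to any particular vendor's tooling.

```python
import pandas as pd

def statistical_checks(legacy: pd.DataFrame, migrated: pd.DataFrame,
                       key: str, date_col: str) -> dict:
    """Compare high-level statistics between a legacy extract and the
    migrated copy of the same table.

    Returns {check_name: (legacy_value, migrated_value, match)}.
    """
    stats = {
        "row_count": (len(legacy), len(migrated)),
        "distinct_keys": (legacy[key].nunique(), migrated[key].nunique()),
        "null_keys": (int(legacy[key].isna().sum()),
                      int(migrated[key].isna().sum())),
        "earliest_date": (legacy[date_col].min(), migrated[date_col].min()),
        "latest_date": (legacy[date_col].max(), migrated[date_col].max()),
    }
    return {name: (a, b, a == b) for name, (a, b) in stats.items()}

# Hypothetical usage: run the checks table by table as each system migrates.
# results = statistical_checks(legacy_labs, migrated_labs,
#                              key="accession_number", date_col="result_date")
# Any check where the match flag is False warrants investigation before sign-off.
```
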

2) Referential testing.

The clinical data obtained from the legacy system is checked throughout the migration process to ensure all linkages and relationships between data tables are intact and that the content integrity within each table is valid. This requires some grit. MediQuant uses more than 700 points of validation in this type of testing process. 
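
As an illustration of the underlying idea (not MediQuant's actual 700-point process, which isn't public), a single referential check might look like the sketch below, again assuming pandas DataFrames and hypothetical table and column names.

```python
import pandas as pd

def orphaned_rows(child: pd.DataFrame, parent: pd.DataFrame,
                  fk: str, pk: str) -> pd.DataFrame:
    """Return child rows whose foreign key has no matching row in the
    parent table; a non-empty result means a broken linkage."""
    return child[~child[fk].isin(parent[pk])]

# Hypothetical relationships: every lab result should link to an encounter,
# and every encounter should link to a patient.
# bad_results = orphaned_rows(lab_results, encounters,
#                             fk="encounter_id", pk="encounter_id")
# bad_encounters = orphaned_rows(encounters, patients,
#                                fk="patient_id", pk="patient_id")
```

In a real migration you would generate checks like this for every relationship and content rule in the data model rather than hand-writing each one, which is where the point count climbs into the hundreds.
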

3) Parallel testing.

A list of randomly selected accounts, representative of the overall inventory, is prepared for parallel (comparison) testing against the source system. Be sure to work with your business users to determine how many accounts they would like included in the sample data set, so they have ownership of the results. Sometimes they can see things in the data that you and your team cannot. When you work collaboratively, the probability of identifying and correcting discrepancies improves dramatically.
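
Staying with the same illustrative pandas setup, a hedged sketch of the sampling and side-by-side comparison might look like this; the account column name and sample size are placeholders for whatever you and your business users agree on.

```python
import pandas as pd

def parallel_sample(account_numbers: pd.Series, n: int, seed: int = 2024) -> list:
    """Randomly select n account numbers for side-by-side review.
    Fixing the seed makes the sample reproducible for re-testing."""
    return account_numbers.drop_duplicates().sample(n=n, random_state=seed).tolist()

def field_diffs(legacy: pd.DataFrame, migrated: pd.DataFrame,
                key: str, sample: list) -> pd.DataFrame:
    """Compare the sampled accounts field by field and return only the
    cells that differ. Assumes both frames share the same columns and
    that every sampled account exists in both; a missing account raises
    an error here, which is itself a finding worth chasing down."""
    left = legacy[legacy[key].isin(sample)].set_index(key).sort_index()
    right = migrated[migrated[key].isin(sample)].set_index(key).sort_index()
    return left.compare(right)  # requires pandas >= 1.1
```

The diff output gives your business users something concrete to review against source-system screens, rather than asking them to eyeball entire records.
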

4) Exception testing.

Exception tests hunt for discrepancies in the data at a higher level that may not show up in parallel testing: accounts without primary diagnoses or physicians, accounts out of balance, patients without birthdates, duplicate records, and so on. These are just a few of the issues often uncovered in validation testing. When exceptions are identified, work with your business users to determine how to resolve them appropriately.
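
Sticking with the same illustrative pandas setup, an exception report covering the examples above might be assembled like this; every column name is a hypothetical stand-in for your own schema.

```python
import pandas as pd

def exception_report(accounts: pd.DataFrame, patients: pd.DataFrame) -> dict:
    """Count population-level exceptions that a small parallel sample can miss."""
    # Hypothetical balance rule: charges minus payments and adjustments
    # should equal the stated account balance.
    out_of_balance = (accounts["charges"] - accounts["payments"]
                      - accounts["adjustments"]) != accounts["balance"]
    return {
        "accounts_missing_primary_dx": int(accounts["primary_dx"].isna().sum()),
        "accounts_missing_physician": int(accounts["attending_md"].isna().sum()),
        "accounts_out_of_balance": int(out_of_balance.sum()),
        "patients_missing_birthdate": int(patients["birth_date"].isna().sum()),
        "duplicate_patient_records": int(patients.duplicated(subset=["mrn"]).sum()),
    }

# Each nonzero count becomes a work item to review with your business users.
```
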

Make sure your team uses all four methods in every legacy clinical data migration. This kind of redundancy in your validation testing is critical to avoiding dirty data in the new system, staying within budget, and protecting against the business disruption scenario described in the intro of this post.

Lastly, keep in mind that large data migrations are complex by nature. With dozens or even hundreds of legacy systems, that complexity increases exponentially. If you and your team need the peace of mind that comes from relying on a third-party expert, or you simply don't have the resource bandwidth to meet the migration project's demands in a timely manner, let me know. Don't wait until physicians and patients feel the impact of data validation 'gone wrong' to do something about it.

Have any projects, experiences, or questions you'd like to share? Any ideas you may want to co-blog about? If so, hit me up in the comments section or feel free to email me at Dr.Kel@mediquantone.com.

I'm Dr. Kel Pults, your blog host. A special 'thank you' goes out to Shelly Disser, VP of Solution Delivery at MediQuant, who contributed to this blog. You'll be hearing more from her in the future! Be on the lookout for our next piece. Until then, like, share, and/or comment on this post!