Provider Data Quality Mass Update Tool

Building a Data Cleaning Tool That Leverages McKesson Data Sets

Challenge

Our client purchased a new Provider Information Management System (PIMS) from McKesson to improve the quality of their existing data set, but they encountered multiple issues during implementation. The component that was supposed to perform mass data updates had limitations and could not apply specific business rules to implement custom logic.

To keep using McKesson services, they were not permitted to modify the system's database directly, so they purchased a solution from another vendor that was supposed to fix these issues. That solution was not working, and time was running out.

Our client needed a system that would consume and update data from various sources and then process it along a predetermined path, so they decided to build it on their own.

Solution

Within a limited timeframe, we built a data gateway that prevents the ingestion of corrupted or duplicated data from third-party systems.
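
To illustrate the gateway concept (this is a minimal sketch, not the client's actual implementation), the Python snippet below shows how incoming provider records might be screened for corruption and duplicates before reaching the PIMS. The required fields, record structure, and function names are hypothetical assumptions made for illustration only.

import hashlib

REQUIRED_FIELDS = {"provider_id", "npi", "name", "specialty"}  # hypothetical schema

def is_corrupted(record: dict) -> bool:
    """Flag records that are missing required fields or carry empty values."""
    return any(not record.get(field) for field in REQUIRED_FIELDS)

def record_fingerprint(record: dict) -> str:
    """Stable hash of the normalized record, used for duplicate detection."""
    normalized = "|".join(str(record.get(f, "")).strip().lower() for f in sorted(REQUIRED_FIELDS))
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def gateway_filter(incoming: list[dict]) -> list[dict]:
    """Pass through only well-formed, previously unseen records."""
    seen: set[str] = set()
    accepted = []
    for record in incoming:
        if is_corrupted(record):
            continue  # drop corrupted records before they reach the PIMS
        fingerprint = record_fingerprint(record)
        if fingerprint in seen:
            continue  # drop duplicates within the same batch
        seen.add(fingerprint)
        accepted.append(record)
    return accepted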
The new tool is able to:
– ingest external data
– apply configurable business rules
– compare it against live PIMS data
– allow either automatic or manual updates of the PIMS data
In addition, the tool provides reporting capabilities on data volumes, trends, and timelines.
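
The listing below is a minimal sketch of how such a rule-driven update flow could be structured: configurable business rules normalize each ingested record, the result is compared with the live PIMS record, and the change is routed to either automatic application or manual review. The rule definitions, field names, and the set of auto-updatable fields are assumptions for illustration; this is not the McKesson API or the client's production code.

from dataclasses import dataclass
from typing import Callable

@dataclass
class UpdateDecision:
    provider_id: str
    changes: dict
    automatic: bool  # True -> apply directly, False -> route to manual review

# A "business rule" is a function that normalizes or corrects one record (assumption).
Rule = Callable[[dict], dict]

def uppercase_state(record: dict) -> dict:
    record["state"] = record.get("state", "").upper()
    return record

def strip_phone_formatting(record: dict) -> dict:
    record["phone"] = "".join(ch for ch in record.get("phone", "") if ch.isdigit())
    return record

# Rules would be driven by configuration rather than hard-coded in practice.
CONFIGURED_RULES: list[Rule] = [uppercase_state, strip_phone_formatting]

# Fields considered safe to update without human review (hypothetical policy).
AUTO_UPDATE_FIELDS = {"phone", "state"}

def diff_against_pims(external: dict, pims: dict) -> dict:
    """Return only the fields whose external value differs from the live PIMS value."""
    return {k: v for k, v in external.items() if pims.get(k) != v}

def process_record(external: dict, pims: dict) -> UpdateDecision | None:
    """Apply configured rules, compare to PIMS, and decide how the update is applied."""
    for rule in CONFIGURED_RULES:
        external = rule(dict(external))
    changes = diff_against_pims(external, pims)
    if not changes:
        return None  # nothing to update
    automatic = set(changes) <= AUTO_UPDATE_FIELDS
    return UpdateDecision(external["provider_id"], changes, automatic)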

Benefits

With this custom framework, our client was able to enhance the member and prospect experience while also meeting the California Department of Managed Health Care's expectations for more timely data validation.

We deployed the solution in 9 months, delivering it in less than 40% of the time initially allocated for the project (the client had started with another vendor but switched to us because they could not afford further delays).

Overall, the tool enabled almost complete workflow automation, resulting in substantial resource savings on the client side (more than 10,000 working hours saved).