Clinical data sourced from electronic health records (EHRs), labs, and health information exchanges (HIEs) contains unique and timely details on members such as vital signs, test results, diagnoses, immunizations, and more. Excited about the potential of clinical data to improve health outcomes and member satisfaction, optimize revenue, strengthen provider networks, and comply with pressing government mandates, health plans are increasing their investment in clinical data acquisition.
But once connections are made, the “pipes” are laid, and clinical data is rolling in, the journey to operationalize and capitalize on this investment has just begun. Across the board, when we assess the quality of the data our clients are acquiring, upwards of 50% of it cannot be used in its raw, native form. Once acquired, the data must be integrated into data management systems, then normalized, enriched, and deduplicated before it can be successfully deployed across downstream applications. Current approaches to data quality improvement rely on skilled data scientists and analysts wielding multiple tools to make clinical data structurally and semantically normalized and interoperable. These traditional approaches cannot readily handle the sheer volume of clinical data, and with a continual influx of new data and ongoing changes to terminologies and standards, the work is never done. As a result, such approaches are costly, inefficient, and limiting.
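To make the normalization and deduplication steps concrete, here is a minimal sketch of the two ideas in Python. The local lab codes, their LOINC mappings, and the record layout are all hypothetical illustrations, not Diameter Health's actual implementation:

```python
# Illustrative only: map source-specific lab codes to a standard
# terminology (LOINC), then drop exact-duplicate results.
# The local codes and record fields below are invented for the example.

LOCAL_TO_LOINC = {
    "GLU-SER": "2345-7",  # Glucose [Mass/volume] in Serum or Plasma
    "HBA1C": "4548-4",    # Hemoglobin A1c/Hemoglobin.total in Blood
}

def normalize(record: dict) -> dict:
    """Replace a source-specific lab code with its LOINC equivalent."""
    code = record["code"]
    return {**record, "code": LOCAL_TO_LOINC.get(code, code)}

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first of any records sharing member, code, date, and value."""
    seen, unique = set(), []
    for r in records:
        key = (r["member_id"], r["code"], r["date"], r["value"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique
```

In practice the mapping tables run to thousands of terms per clinical domain and must be kept clinically correct as terminologies evolve, which is the maintenance burden the article describes.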
In fact, the clinical informatics team at Diameter Health is intimately familiar with the requirements to keep up with evolving national standards and terminology and to ensure that mappings are clinically correct. We incorporate this clinical know-how into our automated Fusion solution so users can “leave the driving to us” and benefit from Fusion’s demonstrated scalability and throughput.
To quantify the return on investment of an automated approach to data integration and normalization, my colleagues Lucy Parente and Chun Li partnered with a large national health plan to develop an Operational Cost Avoidance Model. This health plan had invested millions in clinical data acquisition but recognized that, given the systemic variability and volume of clinical data, it would not be able to cost-effectively realize the promise of that investment. Our informatics team contributed the resource and time allocations required for Upcycling Data™, drawn from nearly a decade of work in this endeavor, along with the number of terms per clinical domain in the dataset provided. The customer contributed its document count, member count, and the number of existing resources involved in data quality activities, based on its extensive experience.
The model can be used to calculate the cost that can be avoided by deploying Diameter Health’s automated Fusion technology in place of manual methods. Using the model, our partner calculated that, for just a subset of its membership, processing 1.96 million continuity of care documents (CCDs), at an average of 2.62 documents per member, would avoid up to $370 million in costs annually. Using the same model, we estimate that manual resources to perform these data improvement functions could cost a mid-sized health plan up to $75 million.
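The shape of the arithmetic behind such a model can be sketched in a few lines. The document and per-member figures come from the article; the analyst minutes per document and hourly rate below are placeholder assumptions, not the health plan's actual inputs:

```python
# Sketch of a cost-avoidance calculation for manual document processing.
# minutes_per_doc and hourly_rate are hypothetical inputs; the real
# Operational Cost Avoidance Model uses many more factors.

def members_covered(documents: int, docs_per_member: float) -> int:
    """Approximate members represented by a document volume."""
    return round(documents / docs_per_member)

def annual_cost_avoided(documents: int, minutes_per_doc: float,
                        hourly_rate: float) -> float:
    """Labor cost of manually processing every document once per year."""
    return documents * (minutes_per_doc / 60.0) * hourly_rate

# Figures from the article: 1.96M CCDs at 2.62 documents per member.
docs = 1_960_000
print(members_covered(docs, 2.62))
print(annual_cost_avoided(docs, minutes_per_doc=30, hourly_rate=60.0))
```

Even with modest placeholder rates, the labor cost scales linearly with document volume, which is why manual approaches break down as acquisition pipelines grow.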
With Diameter Health’s sub-second processing per document, versus the hours required by an analyst, organizations achieve a much faster time to value. And the model doesn’t even account for ancillary benefits: redeployment of skilled resources, consistent and repeatable processes, increased trust in data, and scalability for future growth.
To understand how health plans are speeding time to value and avoiding significant costs to make data usable, download the Whitepaper.