Concepts and Methodologies of Data Validation


Explain the concepts and methodologies of data validation. Think in terms of data feeds from an EMR/EHR that ultimately end up in an enterprise-wide data warehouse. How can data validation ensure data quality and integrity? What are recommendations on how to ensure only validated data ends up in consumable form?

As healthcare organizations (HCOs) begin to leverage data to improve quality initiatives and overall performance, disparities in how data is created, shared, and stored create barriers to analysis. Organizations must focus on data management to improve the way data is entered and stored (Strome, 2013). Kerr, Norris, and Stockdale (2008) suggest that the reactive approach in healthcare today is due to poor data management. End users and outside data sources (e.g., electronic records from a referring provider) are a primary cause of these disparities.

Total data quality management (TDQM) is a methodology that takes a proactive rather than a reactive approach to improving data quality. Marsh (2004) provides a four-step approach that can be leveraged to address current data disparities in healthcare: audit, clean, error prevention, and compliance. Auditing entails reviewing current data for discrepancies such as missing values. Cleaning refers to removing the errors found. Error prevention includes measures such as staff education, and compliance is the continuous monitoring of data for consistency. Leveraging such methodologies is the first step toward improving and maintaining consistency in data quality.
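The audit and clean steps above can be sketched in code. The following Python snippet is a minimal illustration of auditing an incoming EMR feed for missing values and quarantining records that fail validation before they reach the warehouse; the field names and rules are illustrative assumptions, not part of Marsh's methodology or any EMR standard.

```python
# Minimal sketch of the "audit" and "clean" stages of TDQM,
# applied to hypothetical EMR feed records (field names assumed).

REQUIRED_FIELDS = ["patient_id", "birth_date", "encounter_date"]

def audit(records):
    """Audit: report (record index, field) pairs with missing or empty values."""
    issues = []
    for i, rec in enumerate(records):
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                issues.append((i, field))
    return issues

def clean(records):
    """Clean: pass only records that survive the audit; quarantine the rest
    for correction at the source rather than silently loading them."""
    flagged = {i for i, _ in audit(records)}
    valid = [r for i, r in enumerate(records) if i not in flagged]
    rejected = [r for i, r in enumerate(records) if i in flagged]
    return valid, rejected

# Example feed: the second record is missing a birth date.
feed = [
    {"patient_id": "P1", "birth_date": "1980-02-01", "encounter_date": "2023-05-01"},
    {"patient_id": "P2", "birth_date": "", "encounter_date": "2023-05-02"},
]
valid, rejected = clean(feed)
print(len(valid), len(rejected))  # 1 1
```

Routing rejected records back to the source system, rather than loading them, is one way to ensure only validated data reaches consumable form; the error-prevention and compliance stages would then address why the values were missing in the first place.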

One of the most important factors in ensuring data ends up in consumable form is proper training of end users and the implementation of data stewards. According to Juran, "data are of high quality if they are fit for use in their intended operational, decision-making, and other roles" (as cited in Strome, 2013). Improving communication with end users and sharing details about analytical processes will improve end-user understanding, and training helps ensure that inappropriate workarounds are avoided, further improving data quality. Data stewards are individuals who maintain and monitor the processes by which data is used, entered, and stored. Maintaining consistency across multiple functions within an HCO helps alleviate disparities and improve data quality. Moreover, data stewards work closely with analytics teams to determine how current systems are being used and when they should be replaced, giving analytics teams the information necessary to implement change (Strome, 2013).

With health information systems being used for different functions across organizations, from care management to revenue cycle management, organizations require strong governance strategies to improve enterprise-wide processes. Whereas in the past the focus was on improving localized processes, the current healthcare environment requires that organizations focus on the entire enterprise to improve overall quality and performance initiatives. Organizations that work as an enterprise rather than in isolation will prosper in the years to come (Strome, 2013).

References:

Kerr, K., Norris, T., & Stockdale, R. (2008). The strategic management of data quality in healthcare. Health Informatics Journal, 14(4), pp. 259-266. doi: 10.1177/1460458208096555


Marsh, R. (2004). Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management. Database Marketing & Customer Strategy Management, 12(2), pp. 105-112. Retrieved from: http://search.ebscohost.com.proxy.cc.uic.edu/login.aspx?direct=true&db=buh&AN=16249141&site=ehost-live

Strome, T.L. (2013). Data quality and governance. Healthcare Analytics for Quality and Performance Improvement (pp. 6-7). Hoboken, New Jersey: John Wiley & Sons Inc.


