Abstract
The Data Quality Management Model (DQMM) is a model created to manage the quality of data (Wang, 2001). It comprises four tools, namely: application, collection, warehousing, and analysis. Application refers to the purpose for which the data is gathered, while collection refers to the accumulation of data elements (Wang, 2001). Warehousing, on the other hand, refers to the systems and processes applied to store data and journals, whereas analysis deals with the translation of data into information (Wang, 2001). To test the various aspects of data quality, all four tools must be used. The DQMM deals with such aspects of data quality as accessibility, consistency, currency, granularity, precision, accuracy, comprehensiveness, definition, relevancy, and timeliness (Wang, 2001). Therefore, for an Electronic Health Record to be accurate, all these aspects have to be addressed, and the tools mentioned above used appropriately. This paper provides a synopsis of how the DQMM is used to ensure data quality in Electronic Health Records.
Data accuracy is aimed at ensuring that data is valid and carries the correct values. To ensure accuracy, the aim for which the data is gathered is first determined (Wang, 2001). Those tasked with collecting the data are then educated, trained, and kept informed. Thereafter, appropriate edits to measure accuracy, such as error reports, are set up. Finally, the data is analyzed accurately by ensuring that the formulas used are correct (Wang, 2001).
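The accuracy edits described above can be sketched in code. The following is a minimal, illustrative example only: it flags invalid values in collected records and produces a simple error report. The field names, code set, and plausibility rules are hypothetical assumptions, not part of the DQMM itself.

```python
# Illustrative sketch of an "accuracy edit": records that fail a
# validity rule are written to a simple error report.
VALID_SEX_CODES = {"F", "M", "U"}  # hypothetical code set

def run_accuracy_edits(records):
    """Return error-report entries (index, field, reason) for failing records."""
    errors = []
    for i, rec in enumerate(records):
        # Edit 1: coded field must come from the allowed code set.
        if rec.get("sex") not in VALID_SEX_CODES:
            errors.append((i, "sex", "value not in allowed code set"))
        # Edit 2: numeric field must fall in a plausible range.
        age = rec.get("age")
        if not isinstance(age, int) or not 0 <= age <= 120:
            errors.append((i, "age", "value outside plausible range"))
    return errors

records = [
    {"sex": "F", "age": 34},
    {"sex": "X", "age": 34},   # fails the sex edit
    {"sex": "M", "age": 150},  # fails the age edit
]
report = run_accuracy_edits(records)
```

Such a report gives data collectors concrete feedback on which elements failed and why, closing the loop between collection and analysis.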
Data comprehensiveness is another important aspect. To achieve it, the intended uses of the data have to be made clear (Wang, 2001). A comprehensive, cost-effective collection method is then determined. Data collectors, owners, and end-users are then notified of the availability of the data in the system. Finally, all data related to the application is analyzed appropriately (Wang, 2001).
Consistency, on the other hand, involves ensuring that the value of data is the same across applications (Wang, 2001). Extensive training, standardized data collection procedures, and integrated systems are put in place to ensure consistency. Subsequently, conversion tables and edits are used, and the data is then analyzed under standard circumstances (Wang, 2001).
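A conversion table of the kind mentioned above can be pictured as a simple lookup that maps each application's local codes onto one standard value, so the same data element carries the same meaning across systems. The system names and codes below are hypothetical examples, not values from the DQMM.

```python
# Hedged sketch of a conversion table: (source system, raw code) -> standard value.
CONVERSION_TABLE = {
    ("registration", "1"): "male",
    ("registration", "2"): "female",
    ("lab", "M"): "male",
    ("lab", "F"): "female",
}

def standardize(source_system, raw_code):
    """Translate a system-specific code to the standard value, or None if unknown."""
    return CONVERSION_TABLE.get((source_system, raw_code))
```

Returning `None` for unmapped codes lets an edit routine flag them for review rather than silently passing inconsistent values through.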
Data currency is mainly concerned with making sure that data is up to date (Wang, 2001). Therefore, applications have to be changed over time to remain appropriate. After that, the data definitions are changed and documented to keep pace with the times. Subsequently, systems, databases, and tables are continually updated and documented (Wang, 2001). Analysis is then conducted on the data currently available.
Data definition involves ensuring that data is defined clearly for its users. The aim of collecting the data must be clarified (Wang, 2001). Clear data definitions make it easier for data to be collected accurately. Maintenance should then be conducted on the systems regularly, and analysis should be based on the displayed data (Wang, 2001).
In granularity, the attributes of data are expected to be defined at the correct level of detail (Wang, 2001). Different levels of detail may be used while determining the purpose of data collection and during the actual data gathering (Wang, 2001). Warehouse data is stored with the appropriate level of detail, and analysis is conducted depending on the levels of data granularity.
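The idea of storing and analyzing data at different levels of detail can be illustrated briefly. In this hedged sketch, a warehouse keeps each individual dose administered (fine granularity), while a report needs only daily totals (coarse granularity); the dates and amounts are invented for illustration.

```python
from collections import defaultdict

# Fine-grained records: each dose administered, kept individually.
doses = [
    ("2001-03-01", 5),
    ("2001-03-01", 5),
    ("2001-03-02", 10),
]

# Coarse-grained view: aggregate the same data to daily totals for analysis.
daily_totals = defaultdict(int)
for day, mg in doses:
    daily_totals[day] += mg
```

Keeping the fine-grained records in the warehouse preserves the option to re-aggregate at whatever level a future analysis requires.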
When it comes to precision, it is important to ensure that data values are just large enough to support the process (Wang, 2001). The purpose or aim of the data collection must therefore be made clear. Acceptable data values or ranges also have to be defined so as to ensure data precision.
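The two steps above, choosing a precision that matches the process and enforcing an acceptable range, can be sketched as follows. The range, unit, and one-decimal precision are hypothetical assumptions chosen for illustration.

```python
# Hedged sketch: values are rounded to the precision the process needs,
# then checked against a hypothetical acceptable range (degrees Celsius).
ACCEPTABLE_RANGE = (30.0, 45.0)

def record_temperature(value):
    """Round to one decimal place and confirm the value is acceptable."""
    rounded = round(value, 1)  # one decimal place is enough for this process
    low, high = ACCEPTABLE_RANGE
    if not low <= rounded <= high:
        raise ValueError(f"temperature {rounded} outside acceptable range")
    return rounded
```

Rejecting out-of-range values at entry time keeps imprecise or implausible data from ever reaching the warehouse.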
Relevance mainly dictates that data should be meaningful to the application (Wang, 2001). The purpose for collecting the data element has to be clear. Moreover, the data collection instrument must be validated to ensure relevance. Thereafter, appropriate retention schedules should be set to ensure relevant data is available (Wang, 2001). Then, data is displayed for analysis so as to reflect the main purpose for which it was collected.
On the other hand, timeliness has to do with how and in what context data is used. The application defines timeliness, and data is collected in relation to the process at that time. Data has to be made available as per the retention schedules and management policy, and timely analysis is conducted to avoid adverse impacts (Wang, 2001).
Reference list
Wang, R. Y. (2001). Data quality. New York: Springer.