Great data quality, great impact

    Nina Beauchez

    How to improve your organization's data quality to achieve impact.

    The impacts of poor data quality have been studied for quite some time. Twenty years ago, in 1998, Redman described the consequences of poor (digital) data quality for enterprises: customer dissatisfaction, increased operating costs, less effective decision-making, and a reduced ability to execute strategy. During the 1990s, researchers also tried to develop frameworks to determine data quality and how to improve it. Now, some twenty years later, in a time of rapid digitalization, organizations collect more data than ever before. This, however, does not mean that data quality has improved overall since the 1990s. Therefore, we are taking a closer look at data quality.

    ‘Good’ vs. ‘poor’ data quality

    To strive for good data quality, we first need to answer the question of what defines it. The ISO/IEC 25012 data standard defines good data quality as: “The degree to which a set of characteristics of data fulfills requirements”. You might be wondering: requirements from whom? Some authors focus on the user: “High quality data can refer to whether data meets the expectations of the users”. The user in this definition refers not only to humans but also to other systems. For example, you might have data that links two applications or systems together through an API (Application Programming Interface). In this case the “user” is another system, and the expectation is that the data can link the systems.

    From both of these definitions it becomes clear that the quality of data is related to the ‘usability’ of the data. This means that ‘poor’ quality data can be defined as data that does not fulfill the requirements of the user, or that is not usable. However, this seems to result in a black-and-white division of good versus poor data quality. In reality, ‘poor’ data still leaves many possibilities for analysis, so we should regard data quality as a spectrum in which improvement can always take place.

    Characteristics of data quality

    At Data4development we apply the same definition as ISO/IEC 25012, which states that data quality is measured by a set of characteristics. The questions below can help you identify the quality of a data set as well as improve your data. The following characteristics are recognized:

    • Completeness: ‘Is all data recorded?’ Data is complete when it contains sufficient value to be used for its purpose.
    • Validity: ‘Is the data cleaned? Is it correct and useful?’
    • Accuracy: ‘Are the data values correct?’ Data is accurate when it is verifiable by an authoritative source for its intended purpose.
    • Consistency: ‘Is the data produced in a consistent, regulated way?’
    • Availability: ‘Is the data available to end users and applications, when and where they need it?’
    • Timeliness: ‘Is the data up-to-date?’

    This framework is applicable to all (quantitative) data, as it helps you identify where data(sets) can be improved. Improving data quality should be seen as an ongoing process, to ensure correct outcomes.
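    To make the framework concrete, the characteristics above can be turned into simple, measurable checks on a data set. The sketch below scores completeness, validity, and timeliness on a tiny, hypothetical project-activity data set; the field names (`budget`, `country`, `last_updated`) and thresholds are illustrative assumptions, not part of any standard.

```python
from datetime import date

# A tiny, hypothetical project-activity data set; field names are illustrative.
records = [
    {"id": "A1", "budget": 5000, "country": "KE", "last_updated": date(2024, 3, 1)},
    {"id": "A2", "budget": None, "country": "KE", "last_updated": date(2024, 3, 2)},
    {"id": "A3", "budget": -100, "country": "",   "last_updated": date(2020, 1, 1)},
]

def completeness(rows, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def validity(rows):
    """Share of records whose budget is a non-negative number."""
    ok = sum(1 for r in rows
             if isinstance(r["budget"], (int, float)) and r["budget"] >= 0)
    return ok / len(rows)

def timeliness(rows, cutoff):
    """Share of records updated on or after the cutoff date."""
    fresh = sum(1 for r in rows if r["last_updated"] >= cutoff)
    return fresh / len(rows)

report = {
    "completeness(budget)": completeness(records, "budget"),
    "completeness(country)": completeness(records, "country"),
    "validity(budget)": validity(records),
    "timeliness": timeliness(records, date(2024, 1, 1)),
}
for metric, score in report.items():
    print(f"{metric}: {score:.0%}")
```

Running such checks regularly, rather than once, fits the idea of data quality improvement as an ongoing process: the scores show you exactly where a data set falls short and whether it is getting better over time.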

    The possibilities with lesser quality data

    You could look at your data as ingredients from which you want to make a dish that meets all your expectations. For example, if you want to make a great lasagna, it is easier to get the result you expect with great ingredients. However, you can still make a pretty good lasagna with some lesser-quality ingredients and some great cooking skills.

    This is comparable to data quality. Great data quality will meet the requirements of the user. The requirements can often still be met with lesser-quality data; it will just require more effort. However, if your ingredients are moldy and rotten, it will be impossible to create a world-class lasagna. Therefore, it is always better to make sure your data quality improves or, at a minimum, is maintained.

    Data4Development can guide you to improved data quality

    However, do not despair if your data lasagna fails. Data4Development has worked with many nonprofits and NGOs on improving their data quality for data-driven decisions, focusing on the most important characteristics so your data improves as soon as possible. That way, your organization can get the insight it needs for data-driven decision-making.

    Not only does Data4Development offer consultancy on data standards, sourcing, and IT vision; we have also recently launched our Data Workbench. The Data Workbench can improve your data quality by verifying your IATI data sets. With the Data Workbench we offer you one platform to access the tools you need: a workbench to gather data, prepare and verify data sets, and help you use and publish data that is relevant, accurate, and IATI compliant. This is ideal for sharing your results in a transparent manner as well as improving your organization’s impact. For more information on our products and services, click here. What better way to become a data masterchef?

    - Levitin, A.V. and Redman, T., “Data as a resource: properties, implications, and prescriptions”, Sloan Management Review, Vol. 40 No. 1, 1998, pp. 89-101.

    - ISO/IEC 25012: 2008 defines a general data quality model for data retained in a structured format within a computer system.

    - L. Sebastian-Coleman, Measuring Data Quality for Ongoing Improvement: A Data Quality Assessment Framework (2012).
