January 13th, 2023 | 6 min read
The cost of poor quality data and how to fix it
High-quality data is essential for every organization: it underpins the ability to drive process efficiencies, fuel innovation, acquire and retain customers, and bring new products and services to market. Even the most advanced array of analytics and business intelligence tools can't support your business's growth ambitions without a reliable and scalable supply of trustworthy data to consume.
While it is almost impossible to estimate the cost of poor data quality precisely, since it depends heavily on the nature and size of the business and the extent of the problem, experts have reported that poor data quality can cost businesses anywhere from 20% to 35% of their revenue. Ouch!
Poor data quality can be costly for businesses in a variety of ways. Some of the main costs associated with poor data quality include:
- Reduced operational efficiency: Poor data quality can lead to inaccurate or incomplete information, which can slow down or impede business processes, such as inventory management, marketing campaigns, and customer service.
- Increased costs for data cleansing and maintenance: Businesses may have to spend additional time and resources to cleanse and maintain their data, which can increase costs.
- Inaccurate decision-making: Poor data quality can lead to incorrect insights, which can result in poor decisions, missed opportunities, and lost revenue.
- Lost customers and decreased revenue: Poor data quality can lead to poor customer service, as businesses are unable to communicate effectively with their customers. This can lead to lost customers and decreased revenue.
- Legal issues and non-compliance: Poor data quality can result in businesses not being able to comply with legal and regulatory requirements, which can result in fines and penalties.
In order to avoid these costs, it is essential to have a strategy that clearly sets out what “good” data quality looks like for your organization, how it supports important business initiatives, and who is responsible for the quality and stewardship of business-critical data. This strategy should also cover how data quality will be assessed, how issues will be dealt with, and how data quality will be improved over time.
Of course, if it were that simple, everyone would be doing it. There are a variety of challenges associated with ensuring data quality, including:
- Data volume and variety: With the growth of big data, businesses are dealing with larger volumes of data from a variety of sources, which can make it difficult to ensure data quality.
- Data complexity: Data can be complex and hard to understand, especially when it comes from multiple sources with different structures and formats.
- Data entry errors: Poor data quality can be introduced through human error during data entry, such as typos, transpositions, and other inaccuracies.
- Data integrity: Ensuring data integrity, or the consistency and accuracy of data across multiple systems and databases, can be problematic as it requires data to conform to entity, referential, and domain-specific rules and constraints, which need to be enforced.
- Data governance: A lack of data governance can lead to a lack of accountability, ownership, and inconsistent definitions of data. Inadequate data governance measures can also compromise the security of data and lead to the mishandling of sensitive and personal data.
- Data profiling: Identifying errors, inconsistencies, and patterns in data may require complex data profiling, which necessitates specialized knowledge and tools.
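To make the profiling and integrity checks above a little more concrete, here is a minimal, purely illustrative sketch in Python. The field names, the set of valid countries, and the email rule are all hypothetical assumptions for the example, not rules from any particular tool:

```python
import re

# Hypothetical customer records, as might be extracted from several source systems.
records = [
    {"id": 1, "email": "ann@example.com", "country": "US"},
    {"id": 2, "email": "bob[at]example.com", "country": "US"},  # malformed email (data entry error)
    {"id": 3, "email": None, "country": "XX"},                  # missing email, unknown country
]

VALID_COUNTRIES = {"US", "GB", "DE"}  # assumed domain rule: country must come from a known set
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple email check

def profile(rows):
    """Return a simple completeness metric for email plus a list of per-record issues."""
    issues = []
    for row in rows:
        if not row["email"] or not EMAIL_RE.match(row["email"]):
            issues.append((row["id"], "invalid_email"))
        if row["country"] not in VALID_COUNTRIES:
            issues.append((row["id"], "unknown_country"))
    completeness = sum(1 for r in rows if r["email"]) / len(rows)
    return completeness, issues

completeness, issues = profile(records)
print(f"email completeness: {completeness:.0%}")
print(issues)
```

Real profiling tools compute this kind of metric (completeness, validity, distinct counts, pattern frequencies) across every column automatically, but the underlying idea is the same: measure the data against explicit rules so that issues become visible and countable.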
Master data management (MDM) has long been touted as a means of addressing data quality issues as it focuses on the accuracy, consistency, and completeness of your master data. We would argue, though, that this doesn't go far enough. Master data is the critical data that is used across multiple business functions, such as customer, product, and supplier data. Typically, this data resides in structured files and systems and should be relatively easy for an MDM system to ingest. However, a lot of valuable data can be found in semi-structured and unstructured sources such as emails, PDFs, and presentations. In order to create a Golden Record, or single source of truth, MDM systems need to be able to handle all types of data from a multitude of sources.
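The Golden Record idea can be sketched in a few lines. The snippet below merges two hypothetical records for the same customer using simple, assumed survivorship rules (prefer the CRM source, otherwise take the first non-null value); real MDM products use far richer matching and survivorship logic, so treat this only as an illustration of the concept:

```python
# Hypothetical records for the same customer from two source systems.
crm_record = {"email": "ann@example.com", "name": "Ann Smith", "phone": None,          "source": "crm"}
erp_record = {"email": "ann@example.com", "name": "A. Smith",  "phone": "+1-555-0100", "source": "erp"}

def merge_golden_record(records, preferred_source="crm"):
    """Build one consolidated record: preferred source first, then first non-null value per field."""
    records = sorted(records, key=lambda r: r["source"] != preferred_source)
    golden = {}
    for field in ("email", "name", "phone"):
        golden[field] = next((r[field] for r in records if r[field] is not None), None)
    return golden

golden = merge_golden_record([erp_record, crm_record])
print(golden)  # {'email': 'ann@example.com', 'name': 'Ann Smith', 'phone': '+1-555-0100'}
```

Notice that the merged record is more complete than either source on its own: the name survives from the preferred CRM source, while the phone number is filled in from the ERP system.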
As well as cleaning and standardizing data for use across the business, a modern MDM system should enrich data from external sources and increase data quality over time via a process of continual improvement. Modern MDM systems will also support Data Governance, by helping to establish and enforce ownership, accountability, categorization, and oversight of data.
Overall, master data management helps organizations to ensure that their data is accurate, consistent, and complete. By implementing MDM, organizations can improve the quality of their data, which in turn will improve the efficiency of business processes and the quality of decision-making.
To learn more about how MDM systems measure and increase data quality, read our white paper “The Metrics that Define Data Quality”.