Search Data Management published an article today commenting on a recent Gartner survey on data quality issues. In the article, titled
Poor data quality costing companies millions of dollars annually, Jeff Kelly reports that, according to the survey, the average organization surveyed said it loses $8.2 million annually through poor data quality.
Ted Friedman, the Gartner analyst who conducted the research, said in an interview: "Much of this loss is due to lost productivity among workers who, realizing their data is incorrect, are forced to compensate for the inaccuracies or create workarounds when using both operational and analytic applications."
Some numbers from the Gartner study:
- of the 140 companies surveyed, 22% estimated their annual losses resulting from bad data at $20 million. Four percent put that figure as high as an astounding $100 million.
- losses could be even higher were it not for the increasing adoption of data quality tools. According to Gartner, the data quality tools market grew by 26% in 2008, to $425 million.
- around 50% of survey respondents said they are using data quality tools to support master data management (MDM) initiatives, and more than 40% are using data quality technologies to assist in systems and data migration projects.
- fewer than half of respondents currently use data quality tools at the point of capture or creation, which often happens in operational systems such as CRM software.
Some comments and considerations by Ted Friedman:
- The tools are not cheap, so people are doing the right thing by finding multiple ways to use them.
- To improve data quality throughout the organization, vendors must make data quality tools simpler to use so that business users can work with them and begin taking responsibility for the quality of their own data.
- Providing data profiling and visualization functionality (reporting and dashboarding of data quality metrics and exceptions) to a broader set of business users would increase awareness of data quality issues and facilitate data stewardship activities.
- Historically, data quality tools have been most often used in an offline, batch mode -- cleansing data at a point in time, outside the boundaries of operational applications and processes. Gartner advises clients to consider pervasive data quality controls throughout their infrastructure, ensuring that data conforms to quality rules at the point of capture and maintenance, as well as downstream (a rough sketch of what such a point-of-capture check might look like follows this list).
- Organizations are increasingly applying data quality to data domains other than customer data, but more still needs to be done. The quality of financial data in particular costs some companies considerable money in the form of fines for incorrect regulatory filings.
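To make the idea of a point-of-capture control more concrete, here is a minimal sketch in Python. It only illustrates the general technique of validating a record against quality rules before it reaches an operational system; the field names, the rules themselves, and the save_to_crm function are hypothetical, invented for this example, and are not taken from the article or from any particular data quality product.

    import re

    # Hypothetical data quality rules applied at the point of capture,
    # before a record is written to an operational (e.g. CRM) system.
    RULES = {
        "email": lambda v: bool(re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
        "country": lambda v: v in {"US", "BR", "DE", "JP"},  # stand-in for a reference data lookup
        "annual_revenue": lambda v: isinstance(v, (int, float)) and v >= 0,
    }

    def validate(record: dict) -> list:
        """Return the list of fields that violate a quality rule."""
        return [field for field, rule in RULES.items() if not rule(record.get(field))]

    def capture(record: dict) -> None:
        """Accept the record only if it conforms to the quality rules."""
        violations = validate(record)
        if violations:
            # In a real system this would be rejected back to the capturing
            # application or routed to a data steward for correction.
            raise ValueError("data quality violations: %s" % violations)
        save_to_crm(record)  # hypothetical downstream write

    def save_to_crm(record: dict) -> None:
        print("record accepted:", record)

    if __name__ == "__main__":
        capture({"email": "jane@example.com", "country": "US", "annual_revenue": 1200000})

The same rule set could also be run downstream in batch mode over existing records, which is the more traditional usage the article describes; applying it at capture time simply prevents the bad data from entering the operational system in the first place.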
Data quality is one of the most important issues in organizations today. Organizations around the world struggle with data quality to a greater or lesser degree. They need accurate data: an organization can have a well-implemented BI environment, but if its data is not accurate, it will make decisions based on unreliable information.