Data Quality: The Cost of Bad Data

By Dick Weisinger

It’s all too easy to get data wrong, and bad data is costly: poor data quality can lead to bad decisions and dire consequences for businesses.

Analysts agree that poor data costs companies dearly. A study by IBM found that US businesses lose $3.1 trillion annually because of bad data quality. Gartner estimated an average annual cost of $12.9 million per organization for poor data quality. Ovum estimates that the cost could run as high as 30 percent of a business’s revenue.

Why are data errors expensive?

Melody Chien, Senior Director Analyst at Gartner, said that “data quality is directly linked to the quality of decision making. Good quality data provides better leads, a better understanding of customers, and better customer relationships. Data quality is a competitive advantage that D&A leaders need to improve upon continuously.”

Correcting data errors is expensive. In an HBR article, Thomas Redman calls the correction process a “hidden data factory.” For example, a study by Melissa estimates that 20 percent of the contact information collected by companies is incorrect, and that the cost of fixing it follows a 1-10-100 rule: checking a record on a first pass against a reference database costs about $1; fixing a record that slips past that first pass costs about $10; and a record that is never corrected costs about $100 in misplaced shipments, returned mail, and lost opportunities.
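
To make the arithmetic concrete, here is a minimal Python sketch of the 1-10-100 rule. The per-record costs come from the Melissa figures above; the record count, error rate, and catch rates at each pass are hypothetical inputs for illustration, not numbers from the study.

COST_VERIFY = 1      # first-pass check against a reference database ($1/record)
COST_FIX = 10        # remediating a record the first pass missed ($10/record)
COST_FAILURE = 100   # downstream cost of a never-corrected record ($100/record)

def cleanup_cost(records: int, error_rate: float,
                 catch_rate_pass1: float, catch_rate_pass2: float) -> float:
    """Estimated total cost of handling bad records in a contact list.

    error_rate       -- fraction of records with bad data (~0.20 per Melissa)
    catch_rate_pass1 -- assumed fraction of bad records the $1 check catches
    catch_rate_pass2 -- assumed fraction of the remainder fixed at the $10 stage
    """
    bad = records * error_rate
    caught_early = bad * catch_rate_pass1
    fixed_late = (bad - caught_early) * catch_rate_pass2
    never_fixed = bad - caught_early - fixed_late

    return (records * COST_VERIFY          # every record gets the $1 check
            + fixed_late * COST_FIX        # second-pass fixes at $10 each
            + never_fixed * COST_FAILURE)  # errors that reach customers

# Example: 100,000 records, 20% bad, 80% caught early, 50% of the rest fixed.
print(f"${cleanup_cost(100_000, 0.20, 0.80, 0.50):,.0f}")  # -> $320,000

With these assumed inputs, most of the total comes from the small share of errors that are never corrected, which is exactly the point of the rule: the later an error is caught, the more it costs.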
