Access and Feeds

Data Decay: The Impact of Poor Data Quality can be Expensive

By Dick Weisinger

Data decay is the gradual loss of accuracy of stored data. A report from Biznology estimates that B2B data such as job titles, email addresses, and phone numbers changes by as much as 5 percent per month, or more than 70 percent per year. By some estimates, data quality issues cost the US as much as $3.1 trillion every year.

Data needs to be cleaned and checked for integrity on a regular basis to ensure that it stays up to date. Poor data quality is particularly expensive for marketing and sales. Data quality is also often cited as one of the top reasons companies are reluctant to use analytics: they know that their underlying data has issues.
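As a rough illustration of what a periodic integrity check might look like, the sketch below flags contact records with malformed email addresses or verification dates older than an assumed re-verification window. The record layout, field names, and 180-day threshold are hypothetical examples, not taken from any particular product.

import re
from datetime import date, timedelta

# Hypothetical contact records; in practice these would come from a CRM export.
contacts = [
    {"name": "A. Smith", "email": "a.smith@example.com", "last_verified": date(2025, 1, 10)},
    {"name": "B. Jones", "email": "b.jones@invalid", "last_verified": date(2022, 3, 15)},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
MAX_AGE = timedelta(days=180)  # assumed re-verification window

def needs_review(contact, today=None):
    # Flag records with a malformed email or a stale verification date.
    today = today or date.today()
    stale = today - contact["last_verified"] > MAX_AGE
    malformed = not EMAIL_RE.match(contact["email"])
    return stale or malformed

for c in contacts:
    if needs_review(c):
        print("Review needed:", c["name"])

A scheduled job like this only identifies candidates for review; correcting or re-verifying the flagged records still takes human follow-up.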

One way to combat data decay is to enact records management and data governance policies and, where possible, to enforce retention policies so that outdated information is disposed of.
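In practice, a retention policy can be as simple as a scheduled job that disposes of records whose last update falls outside an agreed retention window. The sketch below assumes a hypothetical three-year window and an in-memory record store; a real deployment would log each disposal and route it through whatever approval steps the governance policy requires.

from datetime import date, timedelta

RETENTION = timedelta(days=3 * 365)  # assumed three-year retention window

# Hypothetical record store keyed by id, each with a last-updated date.
records = {
    101: {"last_updated": date(2020, 6, 1)},
    102: {"last_updated": date(2025, 1, 20)},
}

def purge_expired(store, today=None):
    # Remove records whose last update falls outside the retention window.
    today = today or date.today()
    expired = [rid for rid, rec in store.items()
               if today - rec["last_updated"] > RETENTION]
    for rid in expired:
        del store[rid]
    return expired

print("Disposed of record ids:", purge_expired(records))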

Some vendors are beginning to offer AI algorithms that can help sniff out data problems. But, Kathy Rudy, chief data and analytics officer at ISG, warns that “there are no silver bullets. Even advanced technologies and approaches have their limitations. You can use AI to run queries against the data to look for mistakes and gaps, but humans have to decide what and how to fix decayed data or to remove it altogether.”
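Rudy's point that automation surfaces candidates while humans decide the fix holds even for a simple, non-AI pass over the data. The hypothetical scan below only reports possible duplicates and gaps; whether to correct or remove those records remains a human call.

from collections import Counter

# Hypothetical rows with a possible duplicate and a gap (missing phone number).
rows = [
    {"email": "a@example.com", "phone": "555-0100"},
    {"email": "a@example.com", "phone": "555-0100"},
    {"email": "b@example.com", "phone": None},
]

duplicate_emails = [e for e, n in Counter(r["email"] for r in rows).items() if n > 1]
missing_phones = [r["email"] for r in rows if not r["phone"]]

# The automated pass only surfaces candidates; a person decides what to fix or drop.
print("Possible duplicates:", duplicate_emails)
print("Missing phone numbers:", missing_phones)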

Ed King, CEO of Openprise, said that in order “to compete in today’s data-driven economy, companies must maximize data quality and management. Since businesses run on data, organizations are starting to recognize data quality and management as the unsung heroes of demand generation.”
