What To Do About Bad Data
Article Index
What To Do About Bad Data
The Cost of Bad Data
Preventing Errors
Who Should Be Responsible For Data?

The Cost Of Bad Data


Data is believed to double every 12 to 18 months, so if you think your company is buried in data now, it will only get worse. Compounding the problem is the sheer amount of bad data in use. Every organization has it, says Redman; it is an "equal opportunity peril" that costs organizations between 10 and 20 percent of revenue. Redman identifies seven common data quality issues and a benchmark for each one.



Chart: Seven Common Data Quality Issues


1. People can't find the data they need. Benchmark: Knowledge workers spend 30% of their time searching for the data they need, unsuccessfully half the time.
2. Incorrect data. Benchmark: 10-25% of data records contain inaccuracies.
3. Poor data definition. Benchmark: Data is frequently misinterpreted; data from different departments can't be connected.
4. Data privacy/data security. Benchmark: All data is subject to loss and the risk of identity theft.
5. Data inconsistency across sources. Benchmark: The norm when there are multiple databases.
6. Too much data. Benchmark: Half of all data is never used for anything; uncontrolled redundancy.
7. Organizational confusion. Benchmark: Can't answer basic questions such as: How much data is created each day? Which data are most important?


Even a low data error rate is costly. Redman cites his rule of 10: "It costs ten times as much to complete a unit of work when the input data are defective (late, incorrect, missing, etc.) as it does when the input data are perfect." Even if your data is accurate 99 percent of the time, a one percent error rate can lead to extremely high costs (in one example, Redman shows how the costs of an operation can double with a 1 percent error rate). Moreover, poor data is the "number-one reason for the high failure rate of new computer systems," according to Redman. Gartner analyst Ted Friedman predicted, in reports he wrote in 2003 and 2004, that through 2007 "more than 50 percent of data warehouse projects will experience limited acceptance, if not outright failure, because they will not proactively address data quality issues."
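As a rough illustration of how the rule of 10 compounds, the short Python sketch below estimates the expected cost of processing one record when each field has a small chance of being wrong. The figures of 10 fields per record and a tenfold rework cost are illustrative assumptions, not numbers taken from Redman's example.

    # A minimal sketch of the rule-of-10 arithmetic. The field count, unit cost,
    # and the assumption that a record is defective if any single field is wrong
    # are illustrative assumptions, not figures from Redman's example.

    def expected_unit_cost(field_error_rate, fields_per_record=10,
                           clean_cost=1.0, rework_multiplier=10.0):
        """Expected cost to process one record under the 'rule of 10'.

        A record is treated as defective if any one of its fields is wrong;
        defective records cost ten times as much to process as clean ones.
        """
        p_clean_record = (1.0 - field_error_rate) ** fields_per_record
        p_defective_record = 1.0 - p_clean_record
        return (p_clean_record * clean_cost
                + p_defective_record * clean_cost * rework_multiplier)

    if __name__ == "__main__":
        print(expected_unit_cost(0.0))    # 1.0  (perfect data baseline)
        print(expected_unit_cost(0.01))   # ~1.86 (1% per-field error rate)

Under those assumptions, a 1 percent per-field error rate leaves roughly 9.6 percent of records with at least one error, and the expected cost per record works out to about 1.86 times the cost of processing perfect data, in line with the near-doubling described above.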


Next: Preventing Errors







