Improve Data Quality for Competitive Advantage


During the past several decades, managers have expended great effort to stay abreast of the latest information technologies (IT). Despite this, managers still do not have the accurate, timely, and useful data they need to be effective. Data failures are embarrassing and costly. Recent published examples include lawsuits filed to protect consumers from incorrect credit reports, incorrect payment of municipal taxes, and rebates due to incorrect product labeling. No industry — communications, financial services, manufacturing, health care, and so on — is immune. Nor is government.

We at AT&T Bell Laboratories QUEST have initiated a broad-based program to determine the root causes of poor quality data and develop simple, effective strategies for mitigating them.1 Our ultimate goal has been to improve data and information quality by orders of magnitude and create an unprecedented competitive advantage. We have found that:

  • Many managers are unaware of the quality of the data they use and perhaps assume that IT ensures the data are perfect. Although poor quality appears to be the norm rather than the exception, they have largely ignored the issue.
  • Poor quality data can cause immediate economic harm and have more indirect, subtle effects. If a financial services company cannot get my social security number right, I will seriously question its ability to manage my money. Mistrust grows when the data produced by one department, say, order entry, and used by another, say, customer billing, are flawed.
  • Poor data in financial and other management systems mean that managers cannot effectively implement strategies.
  • Inaccurate data make just-in-time manufacturing, self-managed work teams, and reengineering infeasible. The right data need to be at the right place at the right time.
  • Due largely to the organizational politics, conflicts, and passions that surround data, only a corporation’s senior executives can address many data quality issues. Only senior managers can recognize data (and the processes that produce data) as a basic corporate asset and implement strategies to proactively improve them.
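The flawed social security number above illustrates a broader point: many data quality failures are detectable with simple, automated field-level checks applied where data are created. The following is a minimal illustrative sketch (not from the article; the record fields and format rule are assumptions) of such an audit in Python:

```python
import re

# Illustrative rule: a U.S. social security number in NNN-NN-NNNN form.
SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def audit_records(records):
    """Return the records whose 'ssn' field fails the basic format check."""
    return [r for r in records if not SSN_PATTERN.match(r.get("ssn", ""))]

# Hypothetical customer records, as an order-entry system might hold them.
customers = [
    {"name": "A. Smith", "ssn": "123-45-6789"},
    {"name": "B. Jones", "ssn": "123456789"},   # hyphens missing
    {"name": "C. Brown", "ssn": ""},            # value missing entirely
]

flagged = audit_records(customers)
print(len(flagged))  # 2 of the 3 records fail the check
```

A format check of this kind catches only one class of error; a well-formed value can still be wrong. That is why the article's emphasis falls on improving the processes that produce data, not merely on inspecting data after the fact.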

The relatively simple strategies I present here are directly applicable to all data-intensive industries. Their conscientious implementation can vastly improve data quality. At AT&T, the focus on data has led directly to reengineering opportunities and reduced costs. In particular, programs with suppliers (local telephone companies) have greatly improved the quality of bills, at reduced cost to both the supplier and data user. Telstra Corporation, the telecommunications company in Australia, is emphasizing improvements to data quality to help improve service.


1. The only similar program we are aware of is the Total Data Quality Management Program at the MIT Sloan School, under the direction of Stuart Madnick and Richard Wang.

2. T.C. Redman, Data Quality: Management and Technology (New York: Bantam Books, 1992).

3. G. Kolata, “New Frontier in Research: Mining Patterns from Data,” New York Times, 9 August 1994, pp. A19–21; and

W.M. Bulkeley, “Databases Are Plagued by a Reign of Error,” Wall Street Journal, 26 May 1992, p. B6.

4. B. Knight, “The Data Pollution Problem,” Computerworld, 28 September 1992; and

L. Wilson, “The Devil in Your Data,” InformationWeek, 31 August 1992, pp. 48–54.

5. G.E. Liepens, “Sound Data Are a Sound Investment,” Quality Progress, September 1989, pp. 61–64;

G.E. Liepens and V.R.R. Uppuluri, eds., Data Quality Control: Theory and Pragmatics (New York: Marcel Dekker, 1990); and

M.I. Svanks, “Integrity Analysis,” Information and Software Technology 30 (1988): 595–605.

6. D.P. Ballou and G.K. Tayi, “Methodology for Allocating Resources for Data Quality Enhancement,” Communications of the ACM 32 (1989): 320–329;

J.R. Johnson, R.A. Leitch, and J. Neter, “Characteristics of Errors in Accounts Receivables and Inventory Audits,” Accounting Review 56 (1981): 270–293;

K.C. Laudon, “Data Quality and Due Process in Large Interorganizational Record Systems,” Communications of the ACM 29 (1986): 4–18;

R.C. Morey, “Estimating and Improving the Quality of Information in a MIS,” Communications of the ACM 25 (1982): 337–342;

E.T. O’Neill and D. Vizine-Goetz, “Quality Control in On-line Databases,” in M.E. Williams, ed., Annual Review of Information Science and Technology 23 (1988): 125–156;

G.E. Liepens, R.S. Garfinkel, and A.S. Kunnathur, “Error Localization for Erroneous Data: A Survey,” TIMS/Studies in Management Science 19 (1982): 205–219;

K.J. Sy and A. Robbin, “Federal Statistical Policies and Programs: How Good Are the Numbers?” in M.E. Williams, ed., Annual Review of Information Science and Technology 25 (1990): 3–54; and

R. Barstow, “Centel Bashes Database Errors,” Telephony, 28 January 1991, pp. 36–39.

7. C.J. Fox, A.V. Levitin, and T.C. Redman, “The Notion of Data and Its Quality Dimensions,” Information Processing and Management 30 (1994): 9–19; and

R.Y. Wang, D.M. Strong, and L.M. Guarascio, “Data Consumers’ Perspectives on Data Quality” (Cambridge, Massachusetts: MIT Sloan School of Management, TDQM Working Paper 94-01, May 1994).

8. Access services, provided by local telephone companies, connect end users to interexchange carriers.

9. M. Light, “Data Pollution Can Choke Business Process Reengineering,” Inside Gartner Group This Week, 23 April 1993, pp. 5–6.

10. A.V. Levitin and T.C. Redman, “A Model of Data (Life) Cycles with Applications to Quality,” Information and Software Technology 35 (1993): 217–224.

11. T. Levitt, The Marketing Imagination (New York: Free Press, 1986);

W.E. Deming, Out of the Crisis (Cambridge, Massachusetts: MIT Center for Advanced Engineering Study, 1986);

A.V. Feigenbaum, Total Quality Control (New York: McGraw-Hill, 1983);

K. Ishikawa, Introduction to Quality Control (Tokyo: 3A Corporation, 1990);

J.M. Juran, Managerial Breakthrough (New York: McGraw-Hill, 1964);

W.A. Shewhart, Statistical Method from the Viewpoint of Quality Control (Washington, D.C.: Graduate School of the Department of Agriculture, 1939); and

H.M. Wadsworth, K.S. Stephens, and A.B. Godfrey, Modern Methods for Quality Control and Improvement (New York: Wiley, 1986).

12. Process Quality Management & Improvement Guidelines, 1.1 (Indianapolis: AT&T Customer Information Center, 1988).

13. Improving Data Accuracy: The Data Tracking Technique (Indianapolis: AT&T Customer Information Center, 1992).


The author acknowledges the help of Patricia Baker, Rebecca Bennett, Sandra Fuller, Monica Mehan, Kerin Montgomery, Robert Pautke, Carol Spears, John Tomka, and Scott Williamson at AT&T; Dennis Flentje, Brian Fuller, Gerry Gorry, Grant Salmon, and Lynne Wickens at Telstra; present and former members of the Information Quality group, Stu Tartarone and Ralph Wyndrum, Jr., of AT&T Bell Laboratories’ QUEST organization, and an anonymous referee.
