Master Data Expert Explains How to Achieve Quality Data for MDM

Tom Redman’s Best Practices for Improving Data Quality

Master data is quickly becoming the most important data companies have. It’s the foundational information about an organization’s customers, vendors, and prospects, and is the basis for making smart business decisions. As companies look to create and maintain reliable, trusted sources of master data across all of their systems, applications, and processes, we’re seeing a huge investment in Master Data Management (MDM) tools. But getting any real value from these platforms means having quality data first and foremost. At the end of the day, ensuring the consistency, structure, and interoperability of the data is not the responsibility of an MDM platform alone, but of everyone across the organization. That can be a huge challenge, as we’ve seen from the results of our recent research study.

As Tom Redman, “the Data Doc,” explains in Getting in Front on Data, the secret lies in getting the right people in the right roles to “get in front” of the management and functional challenges that lead to bad data in the first place. We asked Mr. Redman – who has nearly 30 years of experience in data quality and serves as an advisor and innovator at Data Quality Solutions – to share his thoughts on the importance of data quality and its role in MDM.

You talk about the importance of data quality in your new book, and while most people recognize it’s a challenge, are you surprised the majority of companies we talked to believe their data quality is very high?

Redman: Not really. Absent a clear reason to believe otherwise, such as measurements or feedback from customers, it is only natural to think quality is pretty good. Absent hard evidence to the contrary, everyone rates their data as well above average. 

I help a lot of companies make their first real measurements. And almost without exception, results are far worse than they suspected.
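To make that concrete, here is a minimal sketch of what a first measurement over a flat customer file might look like. The column names (customer_id, name, email, postal_code) and rules are hypothetical illustrations, not Mr. Redman’s own method; a real measurement would use the fields and rules your data customers care about most.

```python
import csv
import re

# Hypothetical column names; substitute the fields that matter most to your data customers.
REQUIRED_FIELDS = ["customer_id", "name", "email", "postal_code"]
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def record_is_clean(row: dict) -> bool:
    """A record counts as clean only if every checked field passes every rule."""
    for field in REQUIRED_FIELDS:
        if not (row.get(field) or "").strip():
            return False  # missing or blank value
    if not EMAIL_PATTERN.match(row["email"]):
        return False  # malformed email address
    return True


def measure(path: str) -> None:
    """Report the fraction of records that pass all checks."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        print("No records found.")
        return
    clean = sum(record_is_clean(r) for r in rows)
    print(f"{clean} of {len(rows)} records pass all checks "
          f"({100.0 * clean / len(rows):.1f}%)")


if __name__ == "__main__":
    measure("customers.csv")  # hypothetical file name
```

Even a rough tally like this turns “our data is fine” into a number that can be discussed, tracked, and improved.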

What steps should companies take to improve their overall data quality on the road to having strong master data?

Redman: The simple answer is “adopt the philosophy of getting in front and put the right people in the right roles.” The two most important roles are data customer and data creator and, once people step into these roles, quality improves quickly. You need some organizational infrastructure to make this happen at scale, including data quality teams, embedded data managers, and some real expertise, people I call “maestros.” And let’s not forget leadership.

How do you define the term "data customer," and why is that so important?

Redman: By a “data customer,” I mean anyone who uses data, which is just about everyone in the organization. Now, most people know that the quality of what they do depends on the quality of data they use. So they look for errors and make corrections. This happens day-in and day-out, and pretty soon they view it as just another part of their job. You can even admire the extraordinary lengths to which people (and departments) go to deal with bad data.

But that’s not being a good customer! Good customers grow increasingly intolerant of bad data, and work with data creators. Good customers sort out what’s most important, communicate their needs to data creators, and help those creators eliminate the root causes of error. It may sound counterintuitive, but it works remarkably well and is certainly easier than correcting errors.

You’ve also spoken at great length about the “data provocateur.” What’s that all about?

Redman: As I noted earlier, most simply accept bad data as part of their jobs. Provocateurs are different. They see the inherent insanity in continuing to correct bad data and opine that “there has to be a better way.” They try getting in front and demonstrate that quality can be proactively improved. And this example is provocative to the rest of the company.

I don’t know of a great data quality program that started without a provocateur. Companies that have them are very lucky! 

Disparate data sources—or data silos, which you called the “enemy of data quality”—are a frequent challenge we see facing companies today. How can organizations begin to break down those walls?

Redman: Two things here. First is simply people reaching out to one another across silos. For data quality, this means customers reaching out to creators and vice-versa. It almost always works! But too many people are afraid to take the first step. My advice is simply, “Get over it.”

The second tool for breaking down barriers is process management, and I am a big fan [of it]. Think of it this way. Most management is done vertically up and down the organization chart. Process management complements this with horizontal management, knitting departments and functions together in the direction of the customer. 

Data permeates every part of an organization, which makes eradicating these silos so important. But who do you think ultimately should own the data?

Redman: I’m not a big fan of data “ownership.” The plain-English sense of the word conveys rights that are antithetical to what we’re trying to achieve in the data space. For example, if I own something, I can let quality slip, I can sell it, and I can deny access to others. Exactly what we don’t want. Worse, the practical reality today is that whatever department creates or obtains the data acts as though it owns it. Note that this issue and the issue of data silos go hand in hand.

Instead, companies should unequivocally state that data is “owned” by the enterprise. And they should establish the roles I talked about earlier for quality, along with parallel roles for privacy, security, and access control. This is difficult stuff and it will take time. But I don’t see how companies that claim they wish to “manage data assets” or “become data-driven” can skip them.

What role does the C-level team play in data quality?

Redman: Quality programs go as far and as fast as the senior manager, or group of managers, demand. So if the C-level team wishes the company to enjoy the many benefits of high-quality data, it must provide the leadership necessary to make it happen. This means it must adopt the philosophy of getting in front of data quality issues, put the appropriate people and structures in place and, in time, make clear that all must participate.

What mistakes are companies making in their Master Data Management?

Redman: MDM projects usually come up because systems don’t talk to one another. And the work gets treated as a tech problem, without getting to the underlying reasons those systems don’t talk. First, they weren’t designed, architected, and built to do so. And second, systems don’t talk because people don’t share a common language. Until companies address these deeper issues, MDM projects will be Band-Aids, at best.

That notion of not speaking a common language is essentially the issue with most companies’ data, isn’t it? Data is defined and categorized in so many different ways across the enterprise. What can companies do to ensure their data uses one universal language, or one common version of the truth?

Redman: This is an important and nuanced question. Keep in mind that companies are made of specialized departments, doing specialized work, and requiring specialized languages. For example, in an oil company, geologists use different terms than, say, accountants, and use the same terms in different ways. Forcing them to speak a single common language will mean you get poor geology and worse accounting. Not a good idea. Each specialty requires its own language, and that language should be reflected in its data.

At the same time, people do need to work together from time to time, and managers need to develop a broader view. So it is a good idea to define a common language. Geologists have their language; accountants have theirs; and there is a common language for working together, developing a larger perspective, and the like. It is really important that companies recognize these separate needs and strike the right balance.

Finally, it bears mention that a common language is the foundation on which quality master data rests. 
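As an illustration of striking that balance, here is a small sketch of a shared glossary, with entirely invented terms and definitions: each department keeps its own vocabulary, and translation to one canonical concept happens only at the boundary where people work across functions.

```python
from typing import Optional

# Illustrative glossary: one canonical concept per entry, with each department's
# own term preserved. All terms and definitions here are made up for the example.
GLOSSARY = {
    "well": {
        "definition": "A single drilled borehole with one unique identifier.",
        "department_terms": {
            "geology": "borehole",
            "accounting": "drilling cost center",
            "operations": "well",
        },
    },
    "customer": {
        "definition": "A party that has purchased at least one product or service.",
        "department_terms": {
            "sales": "account",
            "finance": "bill-to party",
            "support": "customer",
        },
    },
}


def to_canonical(term: str, department: str) -> Optional[str]:
    """Translate a department-specific term into the shared, canonical concept."""
    for canonical, entry in GLOSSARY.items():
        if entry["department_terms"].get(department, "").lower() == term.lower():
            return canonical
    return None


if __name__ == "__main__":
    print(to_canonical("borehole", "geology"))  # -> well
    print(to_canonical("account", "sales"))     # -> customer
```

The point of the design is that no specialty gives up its own language; the common language exists only for the shared concepts that cross-functional work and master data actually require.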

What do you believe the future of MDM will be? Can companies achieve this without a technology platform down the road?

Redman: It is important to understand that the volume and variety of data are growing exponentially. MDM programs that recognize underlying causes of the issues that I mentioned before, and get in front of those issues, have a chance. Programs that don’t are in for tough times.

As for technology, of course companies will need a platform. But today, that should be the least of their worries.  As I noted before, they should start by developing a common language, even if it is only one definition at a time.  Doing so pays immediate dividends because people can work together across departmental lines.  As they get traction, companies can turn their attention to technological platforms.  This is a two-step process:  First language, then technology.  Doing it in two steps dramatically increases the probability of success. 


Be sure to check out Mr. Redman’s latest book, Getting in Front on Data: Who Does What, to understand how to bring your whole organization forward to get a collective grip on data. Because without quality data in hand, even the best technology won’t help.

Tom Redman, the “Data Doc,” helps companies, including many of the Fortune 100, improve data quality. Those that follow his innovative approaches enjoy the many benefits of far better data, including far lower cost. He is the author of Getting in Front on Data: Who Does What (Technics Publications, 2016) and Data Driven (Harvard Business Press, 2008). His articles have appeared in many publications, including Harvard Business Review, The Wall Street Journal, and MIT Sloan Management Review. Mr. Redman started his career at Bell Labs, where he led the Data Quality Lab. He holds a Ph.D. in Statistics and two patents.