Driving ROI with Master Data Management, Part III: Project Iteration

ABSTRACT: This final blog in our series on the ROI of master data management recommends ways for data teams to iterate their MDM initiatives based on the successes and failures of their first project.

Sponsored by Semarchy

The United Nations has worked hard to help its multilingual members understand one another. The UN started with five official languages in 1945, added Arabic in 1973, and made numerous changes this century to further standardize the communications of geopolitics. 

It’s a similar story with master data management, which standardizes the communications of business: you learn as you go and make course corrections to improve efficiency over time.

MDM defined

Master data management is a program that helps business professionals understand one another. Its practices and tools improve the accuracy and consistency of records that describe business entities such as products, customers, and partners. MDM matches and merges data across systems to create standard attributes and terms, eliminate duplicates, and resolve discrepancies. The resulting “golden records” strengthen data governance programs and serve as a single source of truth for the business.
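
To make the matching-and-merging step concrete, here is a minimal Python sketch of how duplicate customer records might be grouped by name similarity and merged into a golden record. The records, fields, and similarity threshold are hypothetical, and commercial MDM platforms apply far richer matching and survivorship logic.

```python
from difflib import SequenceMatcher

# Hypothetical customer records drawn from two source systems.
source_records = [
    {"id": "CRM-001", "name": "Acme Corp.", "city": "Boston", "phone": "617-555-0100"},
    {"id": "ERP-417", "name": "ACME Corporation", "city": "Boston", "phone": ""},
    {"id": "CRM-002", "name": "Globex Inc.", "city": "Chicago", "phone": "312-555-0199"},
]

def similar(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def build_golden_records(records, threshold=0.7):
    """Group records whose names look alike, then merge each group by
    keeping the most complete (longest non-empty) value for every field."""
    groups = []
    for rec in records:
        for group in groups:
            if similar(rec["name"], group[0]["name"]) >= threshold:
                group.append(rec)
                break
        else:
            groups.append([rec])

    golden = []
    for group in groups:
        merged = {"source_ids": [r["id"] for r in group]}
        for field in ("name", "city", "phone"):
            merged[field] = max((r[field] for r in group), key=len)
        golden.append(merged)
    return golden

for record in build_golden_records(source_records):
    print(record)
```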

This blog concludes a series that examines the return on investing in MDM. The first blog helped prepare a business case for MDM based on its impact on the risk, time, and resources required for data processing. The second defined ways to select the right architectural approach and measure the ROI of that critical first project. This third and final blog recommends ways for data teams to iterate their MDM initiatives based on the successes and failures of project 1.

Successes

We start by considering three primary success factors: organizing stakeholders, choosing the right architectural approach, and applying artificial intelligence. (For details, also read this Eckerson Group report by Joe Hilleary about MDM best practices, and these case studies from our sponsor Semarchy.) Data leaders must assess these factors for project 1 so they can replicate and amplify them in projects 2 onward.


Data leaders must assess how they organized stakeholders, chose their architectural approach, and applied artificial intelligence in their first MDM project 


1. Organize stakeholders. A successful MDM project enlists, motivates, and tracks the contributions of stakeholders, including the executive sponsor, project manager, data steward, data engineer, and data consumer. As a data leader, you should assess how well project 1 organized its stakeholders. Which individuals, tasks, or handoffs made the biggest contribution to project success? Identify these elements and evaluate their impact on ROI in terms of risk, time, and resources. Then you can replicate and amplify those elements in project 2. For example, suppose a B2B organization made its data stewards 10% more productive with a gamification program in which they compete to resolve conflicting business records. That’s a technique for other functional teams to replicate in projects 2 onward.

2. Choose the right architectural approach. As explained earlier in our series, a successful project implements the MDM approach—registry, consolidated hub, centralization, or coexistence—that best fits the business reality of that organization. Assess how well your project 1 aligned with business reality. Perhaps your autonomous division in Eastern Europe boosted efficiency by synchronizing its master data with the global registry each month. If so, look for other autonomous divisions to do the same in project 2. Or perhaps your consolidated MDM hub created golden records that enabled your finance department to reduce compliance risk. If so, consider how your sales department can do the same in project 2. 

3. Apply artificial intelligence. Machine learning makes MDM more efficient and scalable by matching records, detecting anomalies, and generating repeatable rules. As a data leader, you should encourage your team to implement an ML-assisted MDM tool in project 1, then measure the results. Did the ML algorithms reduce the risk of errors or accelerate the matching and merging process, and if so, how? Identify the tool features, use cases, and expert users that delivered results in project 1. ML-assisted MDM works best with high volumes of business records that have low variability and minimal regulatory requirements. The less time humans spend matching and merging records like these, the better the ROI for projects 2 onward.
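
For readers who want a feel for what sits behind ML-assisted matching, the sketch below trains a simple pair classifier on hypothetical, steward-labeled record pairs and reports precision and recall, two measures a data leader could track when assessing project 1. It assumes the scikit-learn library and is a simplified stand-in for a commercial matching engine.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

# Each row describes a candidate record pair:
# [name_similarity, address_similarity, phone_match]. Values are invented.
features = [
    [0.95, 0.90, 1], [0.40, 0.20, 0], [0.88, 0.75, 1],
    [0.30, 0.10, 0], [0.92, 0.60, 1], [0.25, 0.35, 0],
    [0.97, 0.85, 1], [0.45, 0.50, 0],
]
labels = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = same entity, per steward review

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=42)

model = LogisticRegression().fit(X_train, y_train)
predictions = model.predict(X_test)

# Precision shows how often auto-merges are correct; recall shows how many
# true duplicates the model catches without human review.
print("precision:", precision_score(y_test, predictions, zero_division=0))
print("recall:", recall_score(y_test, predictions, zero_division=0))
```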

Failures

In technology and life, failures hold the most valuable lessons. Data leaders must take a hard look at what went wrong in MDM project 1 to avoid the same or worse results in project 2. Let’s consider common failures that derail our success factors. You can find these failures by looking at disappointing results—perhaps compliance issues, missed deadlines, or duplicative/conflicting records—and working backward from there to pinpoint the root cause. 


Take a hard look at what went wrong in project 1 to avoid the same or worse results in project 2


  • Stakeholders. It’s easy to get the human factor wrong in project 1. Did you fail to engage key stakeholders, motivate them, or get their managers to allocate sufficient time for new MDM procedures? The most common human failure is lack of participation. Without an explicit incentive, the average employee will not create or use a golden record unless it helps them do the job they are paid to do. Find the MDM non-participants and fix their behavior with the right sticks and carrots.

  • MDM approach. The four architectural approaches to MDM exercise varying degrees of control over enterprise teams. Did your centralized hub exert too much control over functional teams and break their processes? Or perhaps your registry exerted too little control and teams ignored it, creating more duplicative and conflicting records than ever. You can reduce such problems in project 2 by adjusting incentives, procedures, or the architectural approach itself.

  • AI/ML. ML-assisted MDM fails when it automates erroneous matches and merges. If this happened in project 1, you have two options to fix it. First, you can turn off the feature so that humans make all the judgment calls, eliminating risk but increasing time requirements. Second and better yet, you can have a domain expert and possibly a data scientist assess the root cause of those project 1 errors. In all likelihood, you can fix the problem by recalibrating the rules that the ML algorithm puts into place.
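
To illustrate that recalibration, the sketch below raises a hypothetical auto-merge threshold until the rate of steward-rejected merges falls below a tolerance, routing borderline pairs to human review instead. The match scores, steward verdicts, and tolerance are invented for the example.

```python
# Scored pairs from project 1: (match score from the algorithm,
# steward verdict: True = genuinely the same entity). Hypothetical values.
reviewed_pairs = [
    (0.98, True), (0.91, True), (0.87, False), (0.85, True),
    (0.82, False), (0.78, False), (0.74, True), (0.60, False),
]

def false_merge_rate(threshold):
    """Share of auto-merged pairs above the threshold that stewards rejected."""
    merged = [verdict for score, verdict in reviewed_pairs if score >= threshold]
    if not merged:
        return 0.0
    return sum(1 for v in merged if not v) / len(merged)

# Raise the threshold until erroneous auto-merges fall below a tolerance,
# sending borderline pairs to human review instead of merging them blindly.
tolerance = 0.10
for threshold in (0.70, 0.75, 0.80, 0.85, 0.90, 0.95):
    rate = false_merge_rate(threshold)
    print(f"threshold {threshold:.2f}: false-merge rate {rate:.0%}")
    if rate <= tolerance:
        print(f"recalibrated auto-merge threshold: {threshold:.2f}")
        break
```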

The urgency of MDM

The challenge is that data teams might not get a second chance. As a data leader you must generate quick, repeatable value with project 1 to secure executive support and budget for project 2. This requires your team to start small and demonstrate that, in the words of Randy Bean, you will “fail fast, learn faster.” You can achieve this by shining a bright light on the success factors of team organization, architectural approach, and AI before and after every project—delivering a strong ROI in the short and long term.

Kevin Petrie

Kevin is the VP of Research at BARC US, where he writes and speaks about the intersection of AI, analytics, and data management.
