Master Data Management – Data Velocity

Worldwide master data management (MDM) software revenue will reach $1.9 billion in 2012, a 21 percent increase from 2011, according to Gartner, Inc. The market is forecast to reach $3.2 billion by 2015. Research by the Dell'Oro Group predicts that average storage requirements for Fortune 1000 companies will grow from 1.2 petabytes in 2011 to 9 petabytes in 2015.

Will the increased spending on MDM software be enough to manage the increase in data? The good news is that it probably will. One of the key aspects of MDM is to define data entities in clear, unambiguous terms. In addition, MDM encourages a focus on having business processes to manage those data entities for accuracy.

This sounds good, but you run the risk of investing a lot of money and not getting the return you expect unless clear objectives for the MDM project are established. MDM was first introduced to address data quality issues and manage a single view of a customer or product. A successful MDM project clearly defines the business process, engages the appropriate people as data stewards, and implements a tool set for managing the data (i.e., people, process, and technology). It also needs a clear scope and definition of the business entities to be managed. Without a clear definition of scope and purpose, the project will not deliver the desired business value.

Gaining agreement on scope and purpose is critical for success. The purpose of the project includes the key performance indicators (KPIs) used to measure the project's success. One simple KPI might be 'percent reduction of returned mail.' The candidate entity would be ADDRESS, and MDM would be used to standardize addresses to facilitate identifying bad addresses, correcting addresses, and managing address changes. Going through the effort of getting clean addresses is valuable to the extent this quality data can be leveraged across all systems.
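To make the idea concrete, here is a minimal sketch of what address standardization can look like. The abbreviation table and helper function are invented for illustration; a real MDM tool would use full postal-standard rules. The point is that once two variant spellings collapse to the same canonical form, bad and duplicate addresses become easy to flag.

```python
import re

# Hypothetical abbreviation table; real tools use full postal standards.
ABBREVIATIONS = {
    "street": "st", "avenue": "ave", "road": "rd",
    "boulevard": "blvd", "suite": "ste", "apartment": "apt",
}

def standardize_address(raw: str) -> str:
    """Lowercase, strip punctuation, and apply standard abbreviations."""
    cleaned = re.sub(r"[^\w\s]", "", raw.lower())
    words = [ABBREVIATIONS.get(w, w) for w in cleaned.split()]
    return " ".join(words)

# Two variants of the same address standardize to the same key,
# so the duplicate is easy to identify and correct once.
a = standardize_address("123 Main Street, Suite 4")
b = standardize_address("123 MAIN ST STE 4")
print(a == b)  # True
```

A matching key like this is what lets the 'percent reduction of returned mail' KPI improve across every system that shares the cleansed address.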

Interestingly, a number of techniques used to load data warehouses can be managed by the MDM solution. The first step when loading a data warehouse involves standardizing the data with 'cross walk' tables for codes (e.g., product codes, risk codes). A typical data warehouse goes through a process of cleansing the data to provide a single view of a company's data. The challenge is the timeliness of the data.
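A cross walk table is, at its simplest, a lookup from each source system's code to the standard code. The sketch below uses invented system names and codes purely for illustration; unmapped codes are flagged rather than guessed, since resolving them is a data steward's job.

```python
# Hypothetical cross walk: (source system, local code) -> standard code.
PRODUCT_CROSSWALK = {
    ("ORDER_SYS", "PRD-01"): "WIDGET_STD",
    ("ORDER_SYS", "PRD-02"): "GADGET_STD",
    ("BILLING",   "W100"):   "WIDGET_STD",  # same product, different source code
}

def standardize_code(source_system: str, code: str) -> str:
    """Translate a source code to the standard code; flag unknowns."""
    try:
        return PRODUCT_CROSSWALK[(source_system, code)]
    except KeyError:
        return "UNMAPPED"  # a data steward would resolve these

print(standardize_code("ORDER_SYS", "PRD-01"))  # WIDGET_STD
print(standardize_code("BILLING", "W100"))      # WIDGET_STD
```

When MDM owns this table, every system that loads the warehouse translates codes the same way, instead of each ETL job maintaining its own copy.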

A data warehouse is going to lag behind, holding data slightly older than what is in the online transactional systems. The lag is caused in part by the time it takes to transform data into the structure the data warehouse expects. Employing MDM can shorten the time it takes to make data available by managing a common set of IDs across systems. This MDM ID can be used to load data for common entities into a single data store more easily.
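The common-ID idea can be sketched as a master index that maps every local key to one master ID, so records for the same entity from different systems land in the same place. The registry contents and system names below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical master index: (source system, local key) -> MDM ID.
MASTER_INDEX = {
    ("CRM",     "C-1001"): "MDM-42",
    ("BILLING", "8844"):   "MDM-42",   # same customer in another system
    ("CRM",     "C-2002"): "MDM-57",
}

def merge_by_master_id(records):
    """Group source records by their shared MDM ID."""
    grouped = defaultdict(list)
    for system, local_key, payload in records:
        master = MASTER_INDEX.get((system, local_key))
        grouped[master].append(payload)
    return dict(grouped)

records = [
    ("CRM", "C-1001", {"name": "Acme Corp"}),
    ("BILLING", "8844", {"balance": 120.0}),
]
print(merge_by_master_id(records))
# {'MDM-42': [{'name': 'Acme Corp'}, {'balance': 120.0}]}
```

Because the matching has already been done in the MDM hub, the warehouse load does not have to re-derive entity identity on every run, which is where part of the lag comes from.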

Address, Customer, Vendor and Product are a few good candidate entities for an MDM project. Address, as mentioned above, has a very tangible KPI: postage costs. The value of managing Customer, Vendor and Product with an MDM project may be a little more difficult to measure. Better management of these data elements would definitely help a company manage customer demand, understand costs and provide customer service.

A broadcast, Change Artists, caught my attention a number of years ago, and I saved the Change Artist transcript from the web. Nestlé's CEO Peter Brabeck and Deputy Executive VP Chris Johnson were interviewed; I think the interview was around 2006. They embarked on a worldwide project they called GLOBE with the objective of improving their data. MDM requires a commitment of the right people, a solid process and systems that support the people and process aspects of the final solution.

The interview provides some great insight into how Nestlé improved the quality of their data.  Here are a couple of interesting comments from the project.

Well, the GLOBE Project, overall, is really not an IS/IT project. I think if it was I don’t think he would have chosen me to run it, because certainly that’s not my background or my expertise. It’s really a business initiative. – GLOBE stands for Global Business Excellence.

Chris Johnson

It’s clear that the business has set clear expectations for the project.

The first one was to implement harmonized best practices. So in other words, to take the best ways in doing business around the Nestlé world and share them. Very basic. The second one, which is not very sexy, but very important, is to standardize our data and our approach to data management; to treat data as a corporate asset. And the third one is to then support this and enforce it through standardized systems and technology. So it’s harmonized best practices, standardized data center, and systems.

Chris Johnson

According to the interview, these objectives were known across the entire company worldwide! About 80 percent of the company was impacted. The interview indicates the project spans 100,000 users across about 1,200 locations: factories, distribution centers, and sales offices.

They highlighted very tangible results from their GLOBE program.

Because quite honestly, we thought before the GLOBE program that we had 6 million customers, vendors, and materials. At least that’s what was in our systems. After going through the GLOBE exercise of cleansing, we realized that, well, about two-thirds of that was duplicate, obsolete, or just plain wrong.

Chris Johnson

Here is a great example of successfully improving data quality and company operations by implementing a project focused on data as an asset.  As evidenced by Nestlé, success is dependent on engaging the right people and having clear objectives.

Conclusion:

The original question was

Will the increased spending on MDM software be enough to manage the increase in data?

The original answer was 'probably will.' But it seems to me that technology alone will not solve the problem. Without engaging the business and having the right objectives, an MDM project is just another expensive IT endeavor. Running MDM from a business perspective to ensure delivery of business value is critical so the investment in MDM projects will be worth IT.

In addition, the problem is not data volume but data velocity: making data available more quickly to the business. MDM speeds up the process of managing quality data and provides context for loading large volumes of data. Once a quality MDM solution is in place, the increase in data volume can be handled with a scalable MDM solution.
