Master Data Management


Delia Rodríguez Lucas

on 30 May 2014


Transcript of Master Data Management

Commissioning organization:

• Author: Delia Rodríguez, deliarodriguezlucas@gmail.com
• Supervisor: Tapani Honkanen, Tapani.Honkanen@hamk.fi
• Representative: Kristiina Ranta, Kristiina.Ranta@hamk.fi

“Master Data is information that remains unchanged for a long time and is repeatedly required in business processes”

"Transactional Data can be changed – it is created, edited, and valid only for a limited period of time, for a specific transaction only"

Olaf Schulz
Master Data
To improve the quality of Master Data, it is essential to understand what it means to its consumers:
Data Quality
Preliminary concepts
Master Data Management
The basic aim of companies is simply to create value, often in the form of money and profitability. To reach that goal they have to be efficient, and consequently optimize every single process within their system. ERPs were conceived for this purpose: to automate procedures and reduce operations, time, and costs.

The challenge increases as the database grows and it becomes difficult to guarantee that the data is reliable, precise, appropriate, standardized, and unique.
Master Data in a company

Identify sources of MD.
Identify producers and consumers of MD.
Collect and analyse Metadata about MD.
Appoint the data responsible (data stewards).
Establish a data-governance council.
Develop the MD model.
Select a toolset.
Design the infrastructure.
Generate and test the MD.
Modify the producing and consuming systems.
Implement the maintenance processes.

(Wolter & Haselden)
MDM Solution
Bachelor's Thesis
Industrial Management Engineering
Spring 2014
The main objective of this Final Thesis is to explain what Master Data Management is and how it affects the overall profitability of companies.
Basic Rules for Master Data
• Unique Description: The description of Master Data should be unique for the entire entity, which means that every concept should be properly defined and accepted by all departments.

• Common Hierarchy: Everyone able to read or write Master Data should know its structure, in order to prevent mistakes and problems in finding the accurate information.

• Coherent Details: All Master Data records should have the same level of detail, because this allows using different search and selection shortcuts over the data.

(Robert Hillard)
Wang and Strong compiled data from current studies, company problems, etc., in order to obtain a realistic list of data characteristics.
Sources of poor data quality
The whole process is pointless if the data that is going to be shared across the different areas is not correct.

Normally, the creation of Master Data does not imply entering new data.
In reality, companies already have the core data, but this data is not well managed – it is neither shared between the systems nor well maintained.
Master Data Creation
"Poor quality Master Data can lead to severe and harmful results for the health of a company. Especially when it is not identified and corrected early on, it can contaminate all downstream systems and information assets, increasing costs, affecting customer relationships, and causing imprecise forecasts and poor decisions."
Impacts of poor quality data
Costs of poor quality data
Master Data Management
Master Data Management - PRACTICE
Practical Case
It is not a one-off action, but rather a continuous process that needs to be well maintained over time in order to ensure its reliability and quality.
Master Data Maintenance
Inadequate Management
Adequate Management
Thank you.
Delia Rodríguez Lucas
HAMK University of Applied Sciences
Industrial Management Engineering
Final Thesis
Spring 2014

“any collection of related data”
“a persistent, logically coherent collection of inherently meaningful data”

“a repository for data”
What is its structure?
What is data?
Entity-Relationship Model
Entity, attributes, entity sets + relations
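The entity-relationship idea above can be sketched in a few lines of Python. This is a minimal, illustrative example (the Customer and Order entities and all field names are hypothetical, not from the thesis):

```python
from dataclasses import dataclass

# Entities: each has a key attribute and descriptive attributes.
@dataclass
class Customer:
    customer_id: int          # key attribute
    name: str
    city: str

@dataclass
class Order:
    order_id: int             # key attribute
    customer_id: int          # foreign key: the "places" relationship
    amount: float

# Entity sets: collections of entities of the same type.
customers = [Customer(1, "Acme Retail", "Berlin")]
orders = [Order(100, 1, 250.0), Order(101, 1, 99.5)]

# Navigating the relationship: all orders placed by customer 1.
placed = [o for o in orders if o.customer_id == 1]
```

In a relational database the same relationship would be expressed with a foreign-key column on the orders table.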
It attempts to integrate all departments and functions across a company into a single computer system that can serve all those different departments' particular needs.
Enterprise Resource Planning (ERP) system
What are its benefits?
What is its structure?
“Data is of high quality if it fits its intended use in operations, decision-making, and planning. Data is fit for use if it is free of defects and possesses desired features.” (Redman)

A more practical point of view:

“The real concern with data quality is to ensure not that the data are perfect, but that they are good enough for the organization to make appropriate and reliable decisions” (Kerr, Norris & Stockdale).
• Ballou and Pazer divide data quality into four dimensions: accuracy, timeliness, completeness and consistency.

• Wand and Wang limit their focus to intrinsic data qualities, of which they define four intrinsic dimensions: completeness, unambiguousness, meaningfulness and correctness.
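One of these dimensions, completeness, is easy to make concrete. A minimal sketch (the records and field names are invented for illustration; `None` stands in for a missing value):

```python
# Hypothetical customer records; None marks a missing value.
records = [
    {"name": "Acme", "city": "Berlin", "vat_id": "DE123"},
    {"name": "Beta", "city": None,     "vat_id": None},
]

def completeness(records):
    """Fraction of non-missing fields across all records."""
    cells = [v for r in records for v in r.values()]
    return sum(v is not None for v in cells) / len(cells)

score = completeness(records)  # 4 of the 6 fields are filled
```

Similar simple metrics can be defined for consistency (same value across sources) or timeliness (age of the last update).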
Are companies aware?
= inability to make decisions
(rely on intuition)
The total cost of poor data quality is in the 8–12% of revenue range.
40-60% of a service organization’s expense is a consequence of poor data.
Other approaches:

Location Information
Supplier data
Customer data
(Dietze & Fischer)

Process costs
Opportunity costs
(Eppler & Helfert)
• Schema:
Naming: Homonyms (same name used for different objects) or Synonyms (different names used for the same object).
Structure: different representations of the same object in the sources in terms of constraints, tables, data types, etc.
• The Instance level: data conflicts (different value representations and different value interpretations).

Lack of validation routines (incorrect)
Lack of Integrity rules (inaccurate)
Lack of data cleaning (duplicate data)
Changes in the structure.
Data conversion or migration.
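The first two sources above, missing validation routines and integrity rules, can be addressed with simple checks at data entry. A minimal sketch, assuming a hypothetical material master record (field names and the `M-0000` id format are invented for illustration):

```python
import re

def validate_material(record):
    """Return a list of problems; an empty list means the record passes."""
    errors = []
    # Validation routine: required fields must be present and non-empty.
    for field in ("material_id", "description", "unit"):
        if not record.get(field):
            errors.append(f"missing {field}")
    # Integrity rule: price must be a positive number.
    price = record.get("price")
    if not isinstance(price, (int, float)) or price <= 0:
        errors.append("invalid price")
    # Format rule: material id must look like 'M-0001'.
    if record.get("material_id") and not re.fullmatch(r"M-\d{4}", record["material_id"]):
        errors.append("malformed material_id")
    return errors
```

Records that fail the check would be rejected or routed to the data responsible instead of entering the system.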
"A combination of applications and technologies that consolidates, cleans, and augments this corporate master data, and synchronizes it with all applications, business processes, and analytical tools.” (Oracle)

MDM is a tool that links IT and business in order to generate and maintain accurate, reliable and consistent information across the whole company.
Consolidation: Master Data has to be identified across the whole company system, then consolidated, and finally placed into a “map” known to the whole company, in order to unify analytics and reporting.
Harmonization: ensures that master data is synchronized across the different areas or departments of the company. This step basically extends the consolidated data, distributing the relevant parts along the system. Therefore, it enriches the client application systems with locally relevant information.
Central Management: the maintenance, storage, and distribution mechanisms of Master Data should be handled in a unified manner.

Our company is a wholesale business that was established in Germany in 2001. We focus on food that is typically sold in supermarkets or grocery stores. Our activity is therefore basically buying diverse kinds of food from different suppliers and selling them to retail companies, normally supermarket chains. So far we only operate in Germany, although we are thinking about expanding abroad.

Four months ago the Sales Manager stated that the demand for chocolate bars was growing and the potential profits from buying and selling these products were very promising. The Purchasing Manager has therefore been making inquiries with different possible suppliers and, after some negotiations, has concluded that the best option (according to the different vendors, plant distances, and prices) is to buy Kik-Kat chocolate bars from the German office of Nestlé.

A business case will be presented in order to show how much master data can affect a company in terms of profitability. This will be demonstrated through data maintenance while handling different processes.
Different good practices were used in Materials Management, Purchasing and Sales&Distribution:
Different errors in Materials Management, Purchasing and Sales&Distribution that led to profitability losses:
Every MDM project is customized to each company according to its concrete requirements and resources, but the common steps are:
Does it have to be applied to the whole system?
The implementation of MDM is a long, complex, and expensive process, which impedes an attractive ROI.

Therefore, sometimes a better approach is to implement the MDM project in a phased way.
Hybrid implementation: data stays where it currently resides (no consolidation into one master repository).
Develop an effective working model.
Allows a good ROI.
Segmental implementation: Solution in one area of the company (Customer):
The most used area.
Expand taking advantage of the previous plan.
Intelligence to improve future business processes
Cleaning and standardizing = transforming the data into the agreed model of Master Data. It is crucial to understand the model in terms of attributes, contents, and the mapping of each source to the final Master Data system.

Similar to the Extract, Transform, and Load (ETL) process that is used in data warehousing.

• Normalize data formats: all the data has the same structure.
• Replace missing values.
• Standardize values using common or standard measurements.
• Map attributes.
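The four steps above can be sketched in a single normalization function. This is an illustrative example only: the source record shape, the `UNKNOWN` default, and the gram-based weight standard are all assumptions, not from the thesis:

```python
def normalize(record):
    """Normalize one source record into an assumed master format."""
    out = {}
    # Normalize data formats: trim whitespace, title-case names.
    out["name"] = record.get("name", "").strip().title()
    # Replace missing values with an explicit default.
    out["country"] = record.get("country") or "UNKNOWN"
    # Standardize values to a common measurement (grams).
    weight, unit = record.get("weight", (0, "g"))
    out["weight_g"] = weight * 1000 if unit == "kg" else weight
    # Map source attribute names onto master attribute names.
    out["supplier_id"] = record.get("vendor_no")
    return out
```

Each source system would get its own mapping function, all producing records in the same master format.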

Aims to eliminate duplicates that appear when two or more sources are merged. It is a balance between “false matches” and “missed matches”, where false matches lead to lost data and missed matches reduce the value of maintaining a common list.

The way this process works is through a tool that calculates a confidence factor for each match, taking into account the number of matching attributes and the proximity of the match.

This degree of confidence is compared against a given limit that is adjusted according to the importance or consequences of a false match.
The higher the degree of confidence, the more likely it is to surpass the limit and create a new match.
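A toy version of such a confidence calculation, assuming exact-equality attribute comparison and made-up weights (real matching tools also use fuzzy string proximity):

```python
def confidence(a, b, weights):
    """Weighted fraction of attributes on which two records agree."""
    total = sum(weights.values())
    score = sum(w for attr, w in weights.items() if a.get(attr) == b.get(attr))
    return score / total

# Assumed attribute weights: the name matters most.
weights = {"name": 0.5, "city": 0.3, "vat_id": 0.2}

a = {"name": "Acme", "city": "Berlin", "vat_id": "DE123"}
b = {"name": "Acme", "city": "Berlin", "vat_id": None}

c = confidence(a, b, weights)   # name and city agree: roughly 0.8

# The limit is tuned against the cost of a false match.
THRESHOLD = 0.75
is_match = c >= THRESHOLD
```

Raising `THRESHOLD` trades false matches for missed matches, which is exactly the balance described above.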
The data responsible, or steward, should be the main person involved: normally a business person who has knowledge of the data, can recognize incorrect data, and has the knowledge and authority to correct the issues.

• Reviewing Master Data creation and the matching between different Master Data records. It is important to check the cases where the data match criteria were close to, but below, the threshold.
• Pulling changes and additions into the MDM system, and distributing the cleansed data to the required places.
• The MDM infrastructure should include tools that help the data steward to recognize issues and simplify corrections – for instance, a program whose objective is to point out questionable matches.
In order to understand and prove the information presented in the first part of this Thesis, a Business Case will be put into practice. In this way it will be easier to understand all the concepts that have appeared throughout this paper, and to check what they imply – how they affect business profitability.
Feedback from Supplier and Customer.
Shared databases, questionnaires.
Data responsible/two employees.
Agreed fields.
Communication within the company.
Data updates.
Information in non-mandatory fields
(SS, Storing, GR processing).
Data-entry employee training.
Adequate document flow.
Comments and reasons.
Document numbers.
Non-mandatory information (GR, warehouse, del. note).
Standard procedures.
Concrete information (receipt hours, unloading points).
Update links (price condition).
Materials Management
Purchasing and Sales&Distribution
No Feedback from Supplier and Customer (Incoterms)
No Data responsible/two employees (error with taxes).
No communication within the company.
Only compulsory fields (contact, unloading point, GR).
Misspelling errors (address).
Lack of data-entry employee training (warehouse).
Inadequate document flow.
No comments or reasons.
Document numbers.
Mandatory information (GR, warehouse, del. note).
No standard procedures.
Lack of concrete information (receipt hours, unloading points).
No updates (price condition).
Materials Management
Purchasing and Sales&Distribution
The purpose of this Thesis was to explain from the ground up what this issue is, how it is linked to companies, how it affects the business process, and finally which solutions exist to prevent and solve its subsequent problems.
In the Adequate Management case: proper data entry prevents many unexpected problems and therefore keeps the system robust. Nevertheless, “good management” requires effort. The objective should not be a perfect process, but rather an adequate process aligned with the business' requirements.

In the Inadequate Management case: small mistakes can trigger huge problems. Profitability losses can come from diverse sources, not only purely monetary ones – a waste of time or a dissatisfied customer also leads to similar results.

Note: only single-source, instance-level problems were considered.
Economic results, IT solution.
76% data entries: an “a priori” method should be studied.
Most of the solutions are purely technical. Is this the correct frame?

My opinion is that the solution should not be so concentrated on the technical aspect, but should spread out to cover both angles.

A system should be implemented to ensure that data is entered correctly – not only regarding terminology, hierarchy, important fields, etc., but also taking care of the required training and courses to make sure that employees are aware of and educated in the topic.
(Rahm & Hai Do)
(Shang & Seddon)
(Uwizeyemungu & Raymond)
(Wang & Strong)