Master Data Management
Ebook · 221 pages · 5 hours


About this ebook

In 1941, a new term was added to the Oxford English Dictionary: Information Explosion (Press, 2013). The term describes the growth in information content first observed seven decades ago, beginning with Fremont Rider, a university librarian who in 1944 estimated that information in university libraries would double in size every sixteen years. Nearly seventy years later, Bounie and Gille produced a report called “International Production and Dissemination of Information” and concluded that the world produced 14.7 exabytes of new information in 2008, three times the amount produced just five years earlier (Press, 2013). This rapid growth in information content has generated a greater need for organizations to evaluate how key organizational content is managed, so as to achieve strategic goals and remain competitive in today’s business environment.

Language: English
Release date: Jun 30, 2017
ISBN: 9781370422890
Author

Binayaka Mishra

Binayaka Mishra is an experienced IT professional with 14+ years of experience across tools and technologies such as Data Warehousing, Big Data Analytics, Cloud Computing, Reporting Analytics and Project Management documentation. He graduated in Computer Science & Engineering from the National Institute of Science & Technology, Berhampur, Odisha, India in 2002. He has worked in several critical roles with multinationals including Tech Mahindra, Oracle Corporation, Wipro Technology, Capgemini UK, Capgemini India Pvt Ltd, UBS, Aon Hewitt Associates India Pvt Ltd, HMRC (UK) and TUI Travel Plc (UK). Beyond the technical side, his mastery extends to functional domains such as Payroll Processing, Tax Calculation, UK National Insurance, BFSI, Telecommunications, Corporate Tax measurement, Investment Banking, Automotive, Asset Management, Security, and Travel & Tourism. Currently working as a Solution Architect / Project Manager at Tech Mahindra, India, he loves listening to music, playing snooker and bowling, and is a keen swimmer. More information can be found on his LinkedIn profile: https://www.linkedin.com/in/binayaka-mishra-b09612142/ For any comments or advice, please feel free to write to: mishra.binayaka.18005@gmail.com


    Book preview

    Master Data Management - Binayaka Mishra

    Chapter 1: Shaolin Tale

    When the subject of Kung Fu comes up, what comes to mind first are the Shaolin masters, who created one of the most devastating and disciplined combat methods the world has known since ancient times. Digging one step further into the origins, Kung Fu was actually invented in the Buddhist Shaolin temple in Henan province, China, where it originated and developed. Over the 1,500 years of its development, Shaolin kung fu became one of the largest schools of kung fu. Likewise, Master Data Management (MDM) is the technology that grew out of CDI (Customer Data Integration), ERP (Enterprise Resource Planning) and PLM (Product Lifecycle Management), the masters of data management.

    In 1941, a new term was added to the Oxford English Dictionary: Information Explosion (Press, 2013). The term describes the growth in information content first observed seven decades ago, beginning with Fremont Rider, a university librarian who in 1944 estimated that information in university libraries would double in size every sixteen years. Nearly seventy years later, Bounie and Gille produced a report called “International Production and Dissemination of Information” and concluded that the world produced 14.7 exabytes of new information in 2008, three times the amount produced just five years earlier (Press, 2013). This rapid growth in information content has generated a greater need for organizations to evaluate how key organizational content is managed, so as to achieve strategic goals and remain competitive in today’s business environment.

    For more than a decade, organizations have adopted a number of different approaches to data integration, from Data Warehousing in the early-to-mid 1990s, striving to achieve informational integration, through to ERP in the mid-to-late 1990s, focusing on operational (process and data) integration. Organizations have expected enterprise technologies to provide real, tangible business benefits, with buzzwords like ‘integration’, ‘collaboration’ and ‘optimization’ proposed to ensure definite success. As a result, organizations around the world invested billions in Data Warehousing and ERP initiatives specifically; unfortunately, this confidence in technology was misplaced, and only a very small number of implementations were successful. We argue that the most important factors in the emergence of MDM have been the unrealized benefits of previous ERP implementations and unresolved informational IS requirements. Indeed, these previous approaches to integration have facilitated the emergence of MDM, which is set to define the organizational landscape for the next five years or so (a fashion cycle) as the solution to the data and information integration problem. In the following sections we present a brief historical account of organizations’ approaches to data integration, namely: Data Warehousing, ERP and ERP II/BI.

    Reflecting on the early-to-mid 1990s, Data Warehousing can be described as an informational solution to an operational problem in terms of data integration. The limitations of traditional Management Information Systems (MIS), perceived as being unable to maintain a consistent view of an organization’s reconciled data, pointed to the potential benefit of a Data Warehousing system. To overcome the problems of traditional approaches to accessing large amounts of data in heterogeneous, autonomous, distributed systems, Data Warehousing introduced the concept of a ‘logically centralized data repository’. The concept of Data Warehousing therefore emerged from the evolution of IS objectives within organizations and the growing demand to analyze (internal and external) business information.

    1.1. 1995 to 2000

    Similar to the experience with Data Warehousing, there was no agreed definition for ERP systems, although their characteristics position these systems as integrated, all-encompassing, complex mega-packages designed to support the key functional areas of an organization. By design, therefore, an ERP is an operational-level system. By the mid-to-late 1990s, ERP vendors provided an alternative operational solution to the data integration problem, retiring the fragmented legacy systems that had previously operated throughout the organization. Such is its scope that an ERP system also promised to deliver on the informational requirements of an organization; the perceived need for Data Warehousing, and with it the rate of Data Warehousing project implementations, was therefore reduced. Because an ERP implementation replaced many of the legacy systems throughout the organization, it can be perceived as the ‘base line application’, containing integrated application data generated as a ‘by-product of transaction processing’, or as an ODS (Operational Data Store), a ‘hybrid structure’ that contains some aspects of a data warehouse and other aspects of a transaction processing environment. Many research studies of ERP implementations have reported how the failure to properly analyze requirements and to understand the impact of the changes brought about by ERP implementations has created problems for implementing organizations and has curtailed the extent to which they have been able to derive benefits from their investments. As organizations moved toward the post-implementation phase of their ERP projects (post Y2K for the vast majority of organizations), the real issue of benefit realization emerged. Pallatto added that concessions and compromises in the design of rushed Y2K ERP projects had negative impacts on systems performance and benefits, impacts which were not promptly and fully communicated to the implementing organization.

    1.2. 2000 to 2005

    One benefit in particular that did not materialize was the provision of an integrated informational platform to facilitate reporting on every aspect of an organization’s activities. This led organizations to reconsider undertaking Data Warehousing projects post-ERP implementation. Post-Y2K, therefore, many organizations discovered that the solution to leveraging their investment in, and retrieving useful data from, an ERP system was to undertake additional initiatives in conjunction with the already implemented ERP system: Data Warehousing; ERP II initiatives embracing the concepts of PIM (Product Information Management) and CDI (Customer Data Integration); and Business Intelligence. Indeed, Ventana Research highlight the fact that over half of the organizations considering MDM have already implemented a PIM or CDI master data deployment. The harsh reality of ERP implementation, at the expense of those organizations that invested resources in the initiative, is that ERP only facilitated getting data into the system; it did not prepare data for use and analysis, because ERP systems lack certain functionality and reporting capabilities. Many organizations experienced frustration when they attempted to use their ERP system to access information and knowledge. It was quickly realized that ERP systems are good at storing, accessing and executing the data used in daily transactions, but not at providing the information needed for long-term planning and decision making, as ERP systems are not designed to know how the data is to be used once it is gathered. As we argued earlier, this has led to the emergence of the Master Data Management (MDM) concept.

    To harness enterprise data integration, which is an integral part of CDI, and to avoid the critical mistakes associated with it, any enterprise concerned with the integrity of its customer data should review the eight critical components of a customer data management environment (a sketch of the fourth component, symmetrical update routines, follows the list):

    i. Business-driven accuracy definitions and thresholds

    ii. Data investigation and analysis

    iii. Comprehensive conversion plans – and dress rehearsals

    iv. Symmetrical update routines for batch and online worlds

    v. Preventative maintenance

    vi. Data audits

    vii. Customer data enhancement

    viii. Data stewardship
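
    To make the fourth component concrete, here is a minimal sketch of a symmetrical update routine in Python; the record fields, rules and function names are illustrative assumptions, not prescriptions from the text. The point is that the batch world and the online world call the same cleansing logic, so the two paths can never drift apart:

        import re

        def standardize_customer(record):
            # Shared cleansing rules, applied identically by batch and online paths.
            cleaned = dict(record)
            cleaned["name"] = " ".join(record.get("name", "").split()).title()
            cleaned["postcode"] = re.sub(r"\s+", "", record.get("postcode", "")).upper()
            email = record.get("email", "").strip().lower()
            if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
                raise ValueError("invalid email: " + email)
            cleaned["email"] = email
            return cleaned

        def online_update(store, record):
            # Online path: cleanse one record as it arrives.
            cleaned = standardize_customer(record)
            store[cleaned["email"]] = cleaned

        def batch_update(store, records):
            # Batch path: the same routine, record by record, so rules never diverge.
            for record in records:
                online_update(store, record)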

    To hone data quality through data governance, organizations can further assess their readiness for CDI technology against the following conditions:

    1. Does data quality have executive-level sponsorship?

        A. Board level

        B. Other

        C. None

    2. Do you have established accuracy definitions and thresholds?

        A. Enterprise basis

        B. Project basis

        C. None

    3. Do you conduct regular data quality audits?

        A. Internal & external

        B. Internal only

        C. None

    4. Do you have common data entry standards?

        A. Across enterprise

        B. Within business line

        C. None

    5. Is data quality awareness part of new staff induction?

        A. All staff

        B. ‘Relevant’ staff only

        C. None

    6. Do you have dedicated data quality staff?

        A. Team

        B. An individual

        C. None

    7. How is your data quality budget handled?

        A. Separate major budget line

        B. Separate line in each project

        C. Ad hoc, funded from projects

    Total scoring:

    A = 3 points

    B = 2 points

    C = 1 point
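
    As a minimal sketch, the scoring above can be automated; the answer list below is an illustrative assumption:

        POINTS = {"A": 3, "B": 2, "C": 1}

        def total_score(answers):
            # Sum the points for the seven answers, e.g. ["A", "B", "C", ...].
            return sum(POINTS[a] for a in answers)

        def commitment_tier(score):
            # Map a total score to the tiers discussed below.
            if score >= 17:
                return "leader"
            if score >= 10:
                return "good progress, more work to do"
            return "seek executive buy-in"

        answers = ["A", "B", "B", "A", "C", "B", "A"]   # hypothetical responses
        score = total_score(answers)                    # 3+2+2+3+1+2+3 = 16
        print(score, commitment_tier(score))            # 16 good progress, more work to do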

    How does your organization’s data quality commitment stack up? If you scored 17 or higher, congratulations! You are among the leaders in taking advantage of the strategic resource that your customer data represents. If you scored between 10 and 16, your organization has taken important steps towards protecting the value of its customer data, but more work remains to be done: your marketing initiatives and strategic decision-making are probably still at risk from faulty data. If you scored below 10, you should consider strategies for achieving executive buy-in to the importance of data quality; without a significant change in how your organization views and manages its customer data, you risk losing ground – and customers – to competitors. To achieve a sustained cultural commitment to data quality, you have to be able to justify the investment in real financial terms:

    i. Hard ROI

    ii. Cost savings

    iii. Reduction in operational risk

    By connecting the definition of quality and the measurement scale to how the data is used, i.e., whether it’s for call centre, marketing, risk management or management information, you can identify or estimate the value of certain customer-based events, such as the following (a worked cost estimate for item ii appears after the list):

    i. The cost of loyal customers who are lost because your call centre didn’t recognize them or didn’t have the right information to properly service them.

    ii. The cost of sending duplicate mailings to the same customer or household, including production and postage – and multiplying that over years of multiple mailings.

    iii. The money that can be saved by preventing the over-extension of credit to a customer who deals with several different departments of your organization or makes purchases under different aliases.

    iv. The value of accurate customer information in making real-time pricing decisions for customers who expect one-to-one personalization.

    v. The value of protecting your corporate brand and avoiding costly fines by preventing compliance violations that result from conducting business with a person or business – or their respective aliases – that appears on one or more government sanction lists.
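
    To make item ii concrete, here is a back-of-the-envelope sketch; every rate and volume in it is an illustrative assumption, not a figure from the text:

        customers = 1_000_000
        duplicate_rate = 0.05        # assumed share of records that are duplicates
        cost_per_piece = 0.85        # assumed production + postage per mail piece
        mailings_per_year = 6
        years = 5

        wasted = customers * duplicate_rate * cost_per_piece * mailings_per_year * years
        print(f"Estimated waste over {years} years: ${wasted:,.0f}")  # $1,275,000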

    1.3. January 2005

    By January 2005, the data quality implementation plans as furnished above for CDI had resulted in the following findings:

    i. The evaluation process does not focus attention on identifying the client’s business-driven data quality needs, and how each vendor’s offering relates to these needs

    ii. The standard list of features and functions included in the qualifying questionnaire may have little to do with how the client will actually use the data quality software selected

    iii. During the proof of concept, the competing vendors’ solutions are often ‘tested’ by using a sample of the client’s data that is not valid in size or composition to produce reliable and meaningful results

    iv. The data quality software is implemented on a project-level basis and typically does not take full advantage of the robustness of the selected solution’s enterprise capabilities

    The following are important ways in which a specialist can strengthen the software selection and implementation process:

    1. Focusing the Process on the Client’s Needs. With this expertise, the specialist can help focus the evaluation and selection process on the client’s own situation, in terms of:

    i. How the data quality solution will be used

    ii. Appropriate accuracy levels to be achieved to meet the client’s real business needs

    iii. IT resources required for a robust implementation and on-going maintenance

    iv. Realistic implementation schedules in relation to the client’s own time constraints

    2. Generating Creative, Client-Focused Problem Solving:

    To take the client-focused approach a step further, the data quality specialist can position part of the RFP as a challenge based on the client’s unique data quality needs and objectives (the bulleted items listed under #1). Each vendor will be asked to present their best solution for maximizing the client’s immediate and on-going data quality performance, within the given parameters.

    3. Selecting an Appropriate Sample for Testing.

    4. Providing a Robust Implementation. The data quality specialist can help to ensure that:

    i. The implementation is robust and takes full advantage of the software’s functionality and capabilities in relation to the client’s needs

    ii. Appropriate data quality accuracy levels are established, based on the data’s business uses

    iii. A program for regular data quality audits and on-going maintenance is established

    5. Key Questions for Selecting a Data Quality Vendor:

    i. Determining how the data quality suite will actually be used

    ii. Defining the accuracy levels that will be needed to meet the client’s business needs

    iii. Profiling the data to determine existing quality levels and identify potential problem areas that must be addressed

    iv. Performing a test run using an appropriate, statistically significant data sample prior to the full integration (a sample-size sketch follows this list)
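
    As a rough guide to what ‘statistically significant’ can mean for item iv, the standard sample-size formula for estimating a proportion, n = z^2 * p(1-p) / e^2, can be applied; the confidence level and margins below are illustrative assumptions:

        import math

        def sample_size(p_est=0.5, margin=0.02, z=1.96):
            # n = z^2 * p(1-p) / e^2 at 95% confidence (z = 1.96);
            # p_est = 0.5 is the conservative worst case when the true
            # defect rate is unknown.
            return math.ceil(z * z * p_est * (1 - p_est) / (margin * margin))

        print(sample_size())             # 2401 records for +/- 2 percentage points
        print(sample_size(margin=0.01))  # 9604 records for +/- 1 percentage point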

    In summary, a customer data conversion can be considered successful only if the resulting data meets the needs of both its business users and the IT team. To help achieve this objective, both groups should be included on the conversion planning and implementation team. To help ensure that the conversion is delivered on time and on budget, the project should include six critical steps (a profiling sketch for step 2 follows the list):

    (1) select an experienced data conversion consultant to lead the project;

    (2) use an automated data profiling tool to thoroughly investigate the data;

    (3) perform a dress rehearsal using small, statistically significant data samples;

    (4) based on the results of the dress rehearsal, update the project estimates and projections;

    (5) conduct a large-volume conversion test;

    (6) run the conversion.
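
    For step (2), here is a minimal data profiling sketch; pandas and the column names are assumptions chosen purely for illustration:

        import pandas as pd

        def profile(df):
            # Per-column overview: null rate, distinct count and an example value,
            # the kind of summary an automated profiling tool starts from.
            rows = []
            for col in df.columns:
                non_null = df[col].dropna()
                rows.append({
                    "column": col,
                    "null_rate": df[col].isna().mean(),
                    "distinct": non_null.nunique(),
                    "example": non_null.iloc[0] if len(non_null) else None,
                })
            return pd.DataFrame(rows)

        # Hypothetical customer extract.
        df = pd.DataFrame({
            "customer_id": [1, 2, 2, 4],
            "email": ["a@x.com", None, "b@y.com", "b@y.com"],
            "postcode": ["SW1A 1AA", "sw1a1aa", None, "EC1A 1BB"],
        })
        print(profile(df))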

    By working together, the business and IT users are very nearly guaranteeing a successful conversion.
