
Managing Blind: A Data Quality and Data Governance Vade Mecum
Ebook · 140 pages · 1 hour


About this ebook

The literal translation of the Latin vade mecum is “go with me”; it refers to a small reference book that you carry with you. Managing Blind is a small, easy-to-read guide to the real-life challenges of data quality and data governance. With over thirty years of experience working on six of the seven continents (he has not made it to Antarctica yet) and across a wide range of industries, including agriculture, mining, manufacturing, processing, transportation, banking, finance, insurance, and healthcare, Peter has come to recognize that while the scale of the challenges and opportunities may vary, the fundamental characteristics of data quality are the same.

Can you imagine how much attention you would have received if you had proposed a data quality or data governance program twenty years ago? Yet by the late 1980s it was already clear that all was not well in the data world. Can you imagine a hotel chain using your social security number as your rewards membership number? A very well known hotel chain did exactly that for many years. While the Y2K bug never materialized, by the turn of the century more and more businesses were totally reliant on their computer systems, and the cost of missing or incorrect data was being measured in the millions. As computer systems become more interconnected and their speed increases, managing blind is an increasingly risky option. As Peter explains: “The difference between an actuary and a gambler is data. The actuary promotes their ability to record and analyze data and the gambler must hide any such ability or risk being asked to leave the casino.”

How would you explain the loss of $125 million in 1999 due to a simple unit-of-measure mistake? Now try explaining a loss of somewhere between $2 billion and $5 billion today because of an inability to effectively monitor risk. There is one thing we all agree on: missing or wrong data increases risk and masks opportunity. In this easy-to-read book, Peter draws on his unique experiences to provide an engaging insight into practical solutions that can be applied by all managers.

Language: English
Publisher: Peter Benson
Release date: Aug 2, 2012
ISBN: 9781476045399
Author

Peter Benson

Mr. Peter Richard Benson is the Founding and Executive Director of the Electronic Commerce Code Management Association (ECCMA). The international association was founded in 1999 to develop and promote the implementation of co-operative solutions for the unambiguous exchange of information. Peter has enjoyed a long career in data-driven systems, starting with early work debugging Vulcan, the precursor of what became dBase, one of the very first relational database applications designed for the personal computer market. Peter went on to design WordStar Messenger, one of the very first commercial electronic mail software applications, which included automated high-to-low bit conversion to allow eight-bit word-processing formatting codes to pass through the early seven-bit UNIX email systems. Peter received a British patent in 1992 covering the use of automated email to update distributed databases. From 1994 to 1998 Peter chaired the ANSI committee responsible for the development of EDI standards for product data (ASC X12E). Peter was responsible for the design, development and global promotion of the UNSPSC as an international commodity classification for spend analysis and procurement. Most recently, in pursuit of a faster, better and lower-cost method for obtaining and validating master data, Peter designed and oversaw the development of the eOTD, eDRR and eGOR as open registries of terminology, data requirements and organizations, mirrored on the NATO cataloging system. Peter is also the project leader for ISO 8000 (data quality) and ISO 22745 (open technical dictionaries). Peter is recognized as an expert on the creation, maintenance and distribution of master data, and on the automatic rendering of high-quality multilingual descriptions from master data that are at the heart of today’s ERP applications and the high-speed, high-relevance text search engines that we have come to depend on. Peter is a proponent of open standards for data portability and long-term data preservation. He works to focus international attention on open metadata and how its use in software applications protects an organization’s rights to their own data, as well as on the importance of data provenance, the ability to track the origin of data.


    Book preview


    Managing Blind

    A Data Quality

    and

    Data Governance

    Vade Mecum

    By Peter R. Benson

    Project Leader for ISO 8000, the International Standard for Data Quality

    Edited by Melissa M. Hildebrand

    rev 2012.08.02

    Copyright 2012 by Peter R. Benson

    Smashwords Edition

    ECCMA Edition License Notes:

    This eBook is licensed for your personal enjoyment only. This eBook may not be re-sold or given away to other people. If you would like to share this eBook with another person, please purchase an additional copy for each recipient. If you’re reading this eBook and did not purchase it, or it was not purchased for your use only, then please visit eccma.org and purchase your own copy. Thank you for respecting the hard work of this author.

    ***~~~***

    Table of Contents

    Preface

    Basic principles

    Chapter 1: Show me the money

    Chapter 2: The law of unintended consequences

    Chapter 3: Defining data and information

    Chapter 4: The characteristics of data and information

    Chapter 5: A simplified taxonomy of data

    Chapter 6: Defining data quality

    Chapter 7: Stating requirements for data

    Chapter 8: Building a corporate business language

    Chapter 9: Classifications

    Chapter 10: Master data record duplication

    Chapter 11: Data governance

    Chapter 12: Where do we go from here?

    Appendix 1: Managing a data cleansing process for assets, materials or services

    Further readings

    ***~~~***

    Preface

    As Ray Charles, Stevie Wonder and many other exceptional people have demonstrated, being blind is not an impediment to greatness. Some companies seem to be able to survive and even prosper without any meaningful data to guide them, so why are data, and the quality of data, so important? When we first moved into our new house, we started a vegetable garden; it was hard work. When we went away for a week, we came back to a beautifully mowed garden where, thanks to our local deer population, there was not a single plant more than 5 cm tall. Still we did not give up, but eventually the weeds won. The weeds grew faster than we could keep them down, or at least faster than the amount of effort we were prepared to invest. Faced with a large collection of unplanted seeds, we put them all in one bucket and threw them over the garden. They grew surprisingly well, and we called it our Magic Garden. We would forage through the tall grass and weeds to find an abundance of carrots, onions, squash and ripe melons; this was fun and a lot less work. My Master’s degree is in Agricultural Economics, so I know that even if the land and the labor were free, the cost of the seed probably exceeded the market value of the product, so it clearly would not go far as a for-profit enterprise.

    In fact, my first assignment as an agricultural consultant was to recommend how to increase the profitability of a farm owned by a very wealthy landowner in England. For five years, I walked around the farm and reviewed the accounts, and could see the obvious changes that would inevitably increase their profit. As was befitting the occasion, lunch was served in the dining room. It was a beautiful setting overlooking an impeccably manicured farm, and we were waited on with style. I should have known better, but as an eager young consultant I was excited to share my findings. My first observation was obvious: the cattle were grazing beautiful grass growing on the rich soil in the flat fields below the Manor house, while large tractors were struggling to plough much poorer soil on the slopes leading up to the Manor house. My first suggestion was to grow grass on the slopes and plough the flat fields. This was met with an indignant “Plough up our steeplechase course? Are you crazy?” Not a good start but, undeterred, I made my second suggestion: reduce the labor count, which was extremely high for the size of the farm. This was met with equal if not greater derision: “You want me to fire my driver, my cook and my gardener? What is wrong with you?” At this point I was beginning to feel like Galileo before the Holy Office of the Catholic Church, having to repent and agree that the earth was the center of the universe and the sun did in fact revolve around the earth. Just as Galileo is rumored to have whispered “Eppur si muove” (and yet it moves), I made a mental note that wealthy landowners were not as interested in profitability as they were in having their personal expenses classified as business expenses.

    My early experiences in agricultural consulting served me well, as not only is agriculture an environment rich in data, but farmers are known for their practical approach to managing what are, in fact, large and extremely complex businesses.

    It is true that companies can succeed with what appears to be very poor quality information based on poor quality data but, as we have seen in the banking and insurance industries, it is not really a sound strategy in the long term. The market rewards efficiency, and today more than ever before, efficient operations are heavily dependent on access to relevant, timely and accurate information. I have always liked the saying that when you’re fighting alligators it is hard to remember that you were hired to drain the swamp. But there is also a flip side: it is hard to pay attention to improving business efficiency when you can sell everything you make and your market is not price sensitive.

    It is also important to remember that data is a business tool, and before you can use it effectively you must understand the business. Knowing what data to collect, when to collect it and how to use the data is actually more important than the data itself. Let me give an example.

    It has been a long time since I worked in the agricultural industry; however, on a recent visit to Africa I accompanied a farm owner as he made his rounds. We stopped briefly at the milking parlor, where milking was in full swing. The farm manager proudly showed me how each of the three milking staff was painstakingly recording in a book the cow’s tag number, followed by the milk quantity and the amount of feed given. I was told that this data was then entered into a computer program and analyzed. It looked impressive, until you realized that the quantity graduation marks on the milk jars had long since disappeared and that the feed was simply dispensed with a plastic scoop; what was left uneaten by one cow just accumulated until the trough was full and required no further scoops. There was also no attempt to reconcile the individual data to the total feed used or the total milk collected. Milk production revenue is determined by quantity and quality: the butterfat percentage and the cleanliness of the milk, measured by somatic cell count. Milk is stored on the farm in refrigerated tanks and collected daily. The quantity is measured as the milk leaves the farm, before it is added to the collection tank. Samples are also collected so that the quality of the milk can be measured in a laboratory. The butterfat and somatic cell count are reported back to the farmer daily, and the price of milk is adjusted accordingly.

    The quantity of milk a cow will give on any given day is determined by its food intake, its genetic potential and, most importantly, where it is in its reproductive cycle. The quantity of milk follows a very predictable and well known lactation curve. Again, I know this not only because it is part of my agricultural training but also because, as a consultant, I had to predict the total herd yield in order to calculate the cash flow for a number of dairy farmer clients, and I did this by hand, without the aid of a spreadsheet. Consulting for dairy farmers is actually a bad idea unless you like getting up very early, as they finish milking at six in the morning and want to talk.
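    The book does not give a formula here, but as an illustrative sketch only: a herd-yield projection of the kind described could be approximated with Wood’s lactation curve, y(t) = a · t^b · e^(−c·t), where t is days in milk. The choice of model and all parameter values below are assumptions for illustration, not the author’s actual method.

    import math

    def daily_yield(t_days, a=20.0, b=0.2, c=0.004):
        # Wood's lactation curve: y(t) = a * t^b * exp(-c * t), yield in kg/day.
        # Parameter values are hypothetical, for illustration only.
        return a * (t_days ** b) * math.exp(-c * t_days)

    def herd_yield(days_in_milk):
        # Projected total daily yield for a herd, given each cow's days in milk.
        return sum(daily_yield(t) for t in days_in_milk)

    # Example: five cows at different stages of lactation (hypothetical data).
    herd = [30, 60, 120, 200, 280]
    print(f"Projected herd yield today: {herd_yield(herd):.1f} kg")

    Summing the curve cow by cow, day by day, is essentially the hand calculation described above, just automated.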

    Now, back to our farm in Africa. Butterfat is determined by genetics and diet, so the amount and the composition of the cake fed during milking is an influencing factor, but only in the context of the whole diet. It takes a lot of very accurate data and enormous skill to improve butterfat, and in the end it really is all about genetics. So, my solution, if you want to increase butterfat, is to add a couple of Guernsey or Jersey cows to the herd.

    On to cleanliness and the somatic cell count; now that really is something in the direct control of the operator. Clean the cows before you milk them, disinfect them afterwards, treat cows infected with mastitis and mark them so you never allow their antibiotic-laden milk to contaminate your saleable milk, and of course keep your dairy
