Data governance is for geeks?
- Think that master data management is boring and never gets the attention of your senior management?
- Is data governance a tedious “IT thing” that your executives choose to ignore?
- Struggling to make a business case for data quality, data stewardship and a corporate recognition of your data asset?
It is a well-acknowledged phenomenon that it is many times harder to sell a potential improvement than to sell protection against a potential risk… so perhaps the following apocryphal tale, about a hypothetical global product organisation, might serve to highlight the potential risk, and the consequential cost, of ignoring master data management in the interconnected, automated, global marketplace in which we all operate…
Once upon a time…
Once upon a time, a global multi-billion-dollar organisation decided that it was time to launch a new product into the market. The organisation had invested in research and development to formulate the new product; invested in market research to assess potential consumer interest; invested in clinical trials to prove the efficacy and safety of the new product; invested in a marketing and advertising programme to inform potential consumers of the new product’s benefits; invested in a trade promotion plan with its retail customers, who would create space for the new product and give it a prominent place in-store; invested in education for its employees on the importance and impact of the new product. It had invested tens of millions of dollars in identifying, researching, creating, branding, manufacturing, distributing, advertising and promoting the product to a global audience.
As with any investment, there was a certain amount of risk involved – there was always a possibility of manufacturing defects, weak demand, competitive counter-measures – but the executives at the organisation understood these risks, and had developed sophisticated processes and systems which reduced and managed these risks. And the systems and processes suggested that everything was under control, and that the product would prove a resounding success, producing a handsome return on the substantial investment.
The confidence of systems
The organisation could feel confident in its processes and systems because it, like many large organisations, had implemented a huge, expensive and all-encompassing business system. This system ran from a bank of computers in a global data centre, administered by a global IT team, and the scale and suitability of the system were not in question – it came from the world’s leading provider of such systems and, as a result, the organisation had learned to trust it as the infallible custodian of good business practice.
The global product launch approached and the marketers admired the modern branding and innovative style of the new packaging; the account managers delighted in the advance orders from their retail customers; the supply chain team hurried to arrange and manage shipping fleets; and the executives watched on as lorryloads of pallets of the new product moved swiftly from factories to distribution centres…
… and then a 13-digit number silently ruined everything.
Weeks before, when the product was approved for market launch, it had been recorded in the all-powerful business system. The system required that the product be allocated a product code – a 13-digit Global Trade Item Number (GTIN). The data entry clerk responsible for creating the definition of the product in the business system didn’t know the correct GTIN to use, but couldn’t create a product record without one, so – using his own initiative – he found a GTIN for a product no longer in production (according to the business system) and decided to use it as a placeholder; the real GTIN could be supplied later.
The system validated that a 13-digit number had been entered, but not whether the code was valid and active – such information is easy to verify (see http://gepir.gs1.org/v32/xx/gtin.aspx to look up any of your GTINs!), but the version of the business system in place didn’t support such checks. Unbeknown to the data entry clerk, the out-of-production GTIN had previously been assigned to a product which contained a problem ingredient – an ingredient which had been withdrawn from the market several years before. As a result, the GTIN referred to a product which had been recalled in haste, with a warning to retailers that any instance of the product must be returned immediately. Retailers had duly recorded these recall instructions in their business systems, in memos to staff and on their equivalents of ‘most wanted’ posters.
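For readers curious about how little a format-only check actually proves, here is a minimal Python sketch of the standard GS1 check-digit calculation for a GTIN-13 (the function name and sample numbers are purely illustrative, not taken from the organisation in this tale). A code can pass this test and still be retired, recalled or already assigned to another product – confirming that requires a registry lookup such as GEPIR, which is precisely the check the business system lacked.

```python
def gtin13_is_well_formed(gtin: str) -> bool:
    """Return True if the string is 13 digits with a correct GS1 check digit.

    Note: this only shows the number is *well-formed*; it says nothing about
    whether the GTIN is registered, active, or already in use for another
    product -- exactly the gap that caught out the organisation in the story.
    """
    if len(gtin) != 13 or not gtin.isdigit():
        return False
    digits = [int(c) for c in gtin]
    # GS1 check digit: weight the first 12 digits alternately 1, 3, 1, 3, ...
    weighted_sum = sum(d * (1 if i % 2 == 0 else 3)
                       for i, d in enumerate(digits[:12]))
    return (10 - weighted_sum % 10) % 10 == digits[12]


if __name__ == "__main__":
    print(gtin13_is_well_formed("4006381333931"))  # True  -- check digit is correct
    print(gtin13_is_well_formed("4006381333932"))  # False -- check digit is wrong
```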
The best laid plans…
As shipments of the new product were being scheduled for delivery to retailer stores and depots across the country, the business systems of those same retailers were flashing red alerts advising that the product must be returned immediately. The first phone call received at the organisation was taken by one of the supply chain team. He told his boss that one of their retail customers had just spotted the problem, and was calling for the correct GTIN, believing that the problem must be a typo.
The boss advised the sales director, who then burst in on the marketing director mid-meeting to get hold of sample products. Sure enough, when they checked the product packaging, the GTIN was exactly as advised to the retailers. The mistake was real – the GTIN from a recalled product had been reused, no-one had spotted it, and millions of items, all printed with a barcode encoding the offending GTIN, were unsellable.
The true cost of poor data
The financial cost was huge; all of the products had to be scrapped (their contents could not be repackaged without quality concerns), but it didn’t stop there. Advertising slots booked across print, television, radio and online media were cancelled, but refunds were few and far between. Retail customers were even less forgiving; failing to meet agreed trade promotion commitments resulted in extensive commercial penalties. All in all, the organisation lost more than £10m in the weeks that followed that first phone call. The impact on its reputation was equally damaging – several customers insisted on additional quality checks over the following months. These customers realised that the reuse of a GTIN was symptomatic of a broader problem: if procedures were not being followed correctly, then what use were quality assurance statements? What other systemic errors were lying in wait, awaiting discovery? One even insisted on its own teams reviewing the organisation’s information security and quality controls.
It took more than a year for the fallout from the debacle to dissipate. The organisation spent further millions in overtime pay, hiring additional temporary staff, reviewing procedures, pacifying customers and rushing through the product re-launch. The GTIN mistake was reviewed, dissected, discussed and debated; consultants were hired to review and recommend changes; the IT system was upgraded to include GTIN checks, and – eventually – business returned to normal. But the organisation still saw data quality as an IT issue, still believed that its systems would substitute for data governance, and still treated data as “an IT thing”. The organisation’s executives salivated at conferences when Big Data was mentioned (on the hour, every hour).
And they all lived happily ever after…
Of course, such a tale could only be a fictional parable. No real business would make such a mistake, and certainly not yours.
Right?