Why Data Should Be a Business Asset: The 1-10-100 Rule


If you run a business, or even just run a household, you’ll understand that chaos creates waste. How many times have you rushed to find an important document, such as a birth certificate, only to have to pay out for an emergency replacement? How many hours have you wasted searching for lost keys, lost passports and lost letters? How many times have you wasted money on an emergency callout for a problem that could have been located and fixed?

Just as chaos at home is disruptive, disorganisation and chaos can have a huge effect on business profitability. If employees don’t follow agreed conventions, they create problems for themselves, and for colleagues down the line. Use of IT systems is strictly controlled for precisely this reason, and businesses work hard to ensure that data is stored in a controlled way. Otherwise, the chaos spreads like a virus.

But what happens when data is captured correctly, then naturally decays? Even the best-laid plans cannot protect against this inevitable, and costly, data quality challenge.

The Data Decay Challenge

Poor data quality is one of the key causes of waste in the enterprise. It creates rework and disorder for everyone who comes into contact with that data. Businesses that understand data quality work hard to capture very high-quality data at the point of entry, to avoid contaminating a healthy database with junk.

But remember: data is linked to living beings, and to an ever-changing world.

Let’s say a business captured your personal data on January 1st, 2004. At the point of capture, the member of staff you spoke to ensured the accuracy of that data. They may have run through checklists and used automated systems to support the data entry, resulting in a record that was completely clean.

On January 1st, 2014, would that record still be accurate? Even if it had not been edited, its contents would almost certainly have decayed. Jobs, children, houses and names all change over the years, affecting the overall integrity of data.
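To make the decay concrete, here is a minimal sketch, assuming (purely for illustration) that 25% of a record’s contents go stale each year as a compound process. The rate is an assumption for the example, not a measured figure:

```python
# A minimal sketch of compound data decay. The 25% annual decay
# rate is an illustrative assumption, not a measured figure.
ANNUAL_DECAY_RATE = 0.25

def accuracy_after(years: int, rate: float = ANNUAL_DECAY_RATE) -> float:
    """Estimated fraction of a record still accurate after `years`."""
    return (1 - rate) ** years

for years in (1, 5, 10):
    print(f"After {years:2} year(s): {accuracy_after(years):.1%} accurate")
# After  1 year(s): 75.0% accurate
# After  5 year(s): 23.7% accurate
# After 10 year(s): 5.6% accurate
```

Even a modest annual decay rate, compounded over a decade, leaves very little of the original record intact.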

Why It Matters

Modern businesses are utterly reliant on data held in databases. From customer data to sales data, stock control to invoicing, the business builds its profits upon data. High-quality data is directly linked to healthy profits. It should therefore be treated as a business asset and kept up to standard with the help of data quality tools.

As we’ve seen, healthy data doesn’t occur naturally. While verification is important at the point of entry, it cannot guarantee data quality in the long term. Likewise, spot checks and one-off data cleansing initiatives can only deal with the database as it is on that day, in that moment. The day after the check is complete, the decay starts again.

1-10-100

To measure the effect of ageing on a database, we can look to the 1-10-100 rule. Developed by George Labovitz and Yu Sang Chang in 1992, it is widely used to describe the escalating cost of quality failures.

The 1-10-100 rule applies to many scenarios concerned with quality and the cost of correction. Our example expresses it in US dollars, but the units can be anything measurable in financial terms; we can equally use the rule to count ‘cost’ in resources or time. For our purposes, it can be applied to data quality challenges at various stages of the database lifecycle, because it illustrates the importance of maintaining a high standard of data quality continually rather than occasionally.

The rule applied to data is as follows:

  • Verifying the quality of a record costs the business $1. This is known as the prevention cost.
  • Cleansing and deduplicating a record costs the business $10. This is the correction cost.
  • Working with a record that’s never cleansed costs the business $100. This is the failure cost.

We can break this down into a simple example.

If a record is added to the database on 1st January 2004, it costs the business around $1 to verify it on that date (or, measured another way, it might take one minute to complete). This is its prevention cost. Verification may involve checking the address against the postcode, ensuring the customer’s name is spelled correctly, and ensuring that the customer is not already listed in the database.
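A point-of-entry check of this kind might look like the sketch below. The field names, the simplified UK postcode pattern and the naive duplicate test are all illustrative assumptions; a production system would use an address verification service and fuzzy matching:

```python
import re

# Sketch of point-of-entry verification (the $1 prevention cost).
# The simplified UK postcode pattern and field names are assumptions.
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", re.IGNORECASE)

def verify_record(record: dict, existing: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    name = record.get("name", "").strip()
    postcode = record.get("postcode", "").strip()
    if not name:
        problems.append("name is missing")
    if not UK_POSTCODE.match(postcode):
        problems.append("postcode is not in a valid format")
    # Naive duplicate check: same name and postcode already on file.
    if any(r.get("name", "").lower() == name.lower()
           and r.get("postcode") == postcode for r in existing):
        problems.append("possible duplicate of an existing record")
    return problems

print(verify_record({"name": "Jane Doe", "postcode": "SW1A 1AA"}, []))  # []
```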

During a data quality initiative, a record is updated and restored to its former quality. This is the correction cost. We understand this to be ten times the effort and ten times the resources, but it is often a necessary part of data maintenance.

However, if that record is never cleansed, never deduplicated and never restored to its former quality, it costs the business 100 times the initial outlay. Why? It represents a failure.
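Putting the three unit costs together gives a back-of-the-envelope model for a whole database. The record counts below are invented for illustration; only the $1/$10/$100 ratios come from the rule:

```python
# Unit costs from the 1-10-100 rule; record counts are illustrative.
PREVENTION, CORRECTION, FAILURE = 1, 10, 100

def database_cost(verified: int, corrected: int, failed: int) -> int:
    """Total cost in dollars of records handled at each stage."""
    return verified * PREVENTION + corrected * CORRECTION + failed * FAILURE

# The same 10,000 records, dealt with earlier or later:
print(database_cost(10_000, 0, 0))  # $10,000    - verified at entry
print(database_cost(0, 10_000, 0))  # $100,000   - corrected later
print(database_cost(0, 0, 10_000))  # $1,000,000 - never cleansed
```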

Failure to maintain data is a failure to maintain acceptable quality standards. It creates waste. Its effects ripple through the business as the erroneous data causes rework and chaos. The business is disorganised and cannot operate efficiently.

And, most importantly of all, the effects of this waste eat into profits.

The 1-10-100 rule cannot take into account the ricochet effects of poor data quality, such as customers who become irritated, or staff who find the database tiresome and difficult to work with. Over time, dissatisfied customers and staff create additional costs that are harder to measure: higher customer acquisition costs, lower conversion rates, reputation repair and staff churn.

The Action Plan

The 1-10-100 rule shows that sooner is better.

Data quality is not a problem that can be tackled once and forgotten, and tackling it soon and often is better than late or not at all.

Continual improvement and maintenance is the key to keeping waste to a minimum in business. The concept applies to data quality, too. Continual management of data is the only way to ensure waste does not become endemic.

Talk of data quality means little unless we attach a cost to it. Using the 1-10-100 rule, we can achieve buy-in from senior managers who are most interested in protecting profit. From there, we can catch problems early and insulate core business functions against decay.
