One of my favorite novels is A Confederacy of Dunces by John Kennedy Toole. The novel tells the tragicomic tale of Ignatius J. Reilly, described in the foreword by Walker Percy as a “slob extraordinary, a mad Oliver Hardy, a fat Don Quixote, and a perverse Thomas Aquinas rolled into one.”
The novel was written in the 1960s, before the age of computer filing systems, so one of the jobs Ignatius has is working as a paper filing clerk in a clothing factory. His employer is initially impressed with his job performance, since the disorderly mess of invoices and other paperwork slowly begins to disappear, resulting in the orderly appearance of a well-organized and efficiently managed office space.
However, Ignatius is fired after he reveals the secret to his filing system—instead of filing the paperwork away into the appropriate file cabinets, he has simply been throwing all of the paperwork into the trash.
This scene reminds me of how data quality issues (aka data defects) are often perceived. Many organizations acknowledge the importance of data quality, but don’t believe that data defects occur very often, because the data made available to end users in dashboards and reports often passes through many processes that cleanse or otherwise sanitize it before it reaches them.
ETL processes that extract source data for a data warehouse load will often perform basic data quality checks. However, a fairly standard practice for “resolving” a data defect is to substitute a NULL value (e.g., a date stored in a text field in a source system that cannot be converted into a valid date value is usually loaded into the target relational database as a NULL value).
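To make that concrete, here is a simplified sketch of what this quiet substitution can look like in a transform step (the function name, date formats, and behavior are purely illustrative, not any particular ETL tool’s API):

```python
from datetime import datetime

def parse_date_or_null(raw_value):
    """Illustrative ETL-style transform: try to convert a text field to a
    date; if it cannot be parsed, silently substitute None (which becomes
    NULL in the target table). The defect is "resolved" only in the sense
    that it disappears."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(raw_value.strip(), fmt).date()
        except (ValueError, AttributeError):
            continue
    return None  # unparseable source value quietly becomes NULL

# parse_date_or_null("2024-02-30")  -> None (the bad source value is gone)
# parse_date_or_null("03/15/2024")  -> datetime.date(2024, 3, 15)
```

No error is raised, no one is notified, and the defective source value never reaches anyone who might have asked why it was defective in the first place.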
When postal address validation software generates a valid mailing address, it often does so by removing what it considers to be “extraneous” information from the input address fields. That extraneous information may include valid data that was accidentally entered into the wrong field, or that had no input field of its own (e.g., an e-mail address typed into an address field is simply deleted from the output mailing address).
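Again, purely as an illustration (the pattern matching below is my own sketch, not any vendor’s actual validation logic), the effect can look something like this:

```python
import re

EMAIL_PATTERN = re.compile(r"\S+@\S+\.\S+")

def cleanse_address_lines(lines):
    """Keep only lines that look like part of a mailing address; anything
    resembling an e-mail address is treated as "extraneous" and dropped,
    even though it may be valid data entered into the wrong field."""
    return [line for line in lines
            if line.strip() and not EMAIL_PATTERN.search(line)]

# cleanse_address_lines(["123 Main St", "jane.doe@example.com", "Apt 4B"])
# -> ["123 Main St", "Apt 4B"]   # the e-mail address is silently discarded
```

The output mailing address is cleaner, but the discarded value (and the data entry problem that put it there) is gone without a trace.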
And some reporting processes intentionally filter out “bad records” or eliminate “outlier values.” This happens most frequently when preparing highly summarized reports, especially those intended for executive management.
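One more illustrative sketch, this time of outlier filtering in a summary report (the z-score cutoff and summary fields are assumptions for illustration, not any real reporting tool’s behavior):

```python
from statistics import mean, stdev

def summarize_without_outliers(values, z_cutoff=2.0):
    """Drop values more than z_cutoff standard deviations from the mean
    before summarizing. The report looks cleaner, but the dropped records
    (possible defects, or legitimate extremes) simply vanish."""
    mu, sigma = mean(values), stdev(values)
    kept = [v for v in values if sigma == 0 or abs(v - mu) <= z_cutoff * sigma]
    return {"rows_reported": len(kept),
            "rows_dropped": len(values) - len(kept),
            "mean": mean(kept)}

# summarize_without_outliers([10, 11, 9, 10, 12, 500])
# -> {'rows_reported': 5, 'rows_dropped': 1, 'mean': 10.4}
#    the 500 disappears from the summary without anyone asking why it was there
```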
These are just a few examples of common practices that can create the orderly appearance of a high quality data environment, but that conceal a confederacy of data defects about which the organization may remain blissfully (and dangerously) ignorant.
Do you suspect that your organization may be concealing A Confederacy of Data Defects?